Search results for: mode superposition method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20521


19411 Applying Element Free Galerkin Method on Beam and Plate

Authors: Mahdad M’hamed, Belaidi Idir

Abstract:

This paper develops a meshless approach, the Element Free Galerkin (EFG) method, which is based on the weak form of the partial differential governing equations and employs the Moving Least Squares (MLS) approximation to construct the meshless shape functions. The variational weak form is used in the EFG, where the trial and test functions are approximated by the MLS approximation. Since the shape functions constructed by this discretization inherit the weight-function property based on the randomly distributed points, the essential boundary conditions can be implemented easily. The local weak form of the partial differential governing equations is obtained by the weighted residual method within a simple local quadrature domain. A spline function with high continuity is used as the weight function. The EFG method developed here is a truly meshless method, as it requires no mesh, either for the construction of the shape functions or for the integration of the local weak form. Several numerical examples of two-dimensional static structural analysis are presented to illustrate the performance of the method. They show that the EFG method is easy to implement and highly accurate. The present method is used to analyze the static deflection of beams and of a plate with a hole.
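
As a concrete illustration of the MLS approximation that underlies the EFG shape functions, the following minimal 1D sketch builds shape functions from a linear basis and a cubic spline weight; the node layout, support size and weight function are illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def cubic_spline_weight(r):
    """Cubic spline weight on the normalized distance r = |x - x_i| / support."""
    w = np.zeros_like(r)
    m1 = r <= 0.5
    m2 = (r > 0.5) & (r <= 1.0)
    w[m1] = 2.0/3.0 - 4.0*r[m1]**2 + 4.0*r[m1]**3
    w[m2] = 4.0/3.0 - 4.0*r[m2] + 4.0*r[m2]**2 - (4.0/3.0)*r[m2]**3
    return w

def mls_shape_functions(x, nodes, support):
    """1D MLS shape functions with a linear basis p(x) = [1, x]."""
    r = np.abs(x - nodes) / support
    w = cubic_spline_weight(r)                         # nodal weights at x
    P = np.column_stack([np.ones_like(nodes), nodes])  # basis evaluated at the nodes
    A = (P * w[:, None]).T @ P                         # moment matrix A(x) = P^T W P
    B = (P * w[:, None]).T                             # B(x) = P^T W
    p_x = np.array([1.0, x])
    return p_x @ np.linalg.solve(A, B)                 # phi_i(x)

nodes = np.linspace(0.0, 1.0, 11)
phi = mls_shape_functions(0.37, nodes, support=0.25)
print(phi.sum())   # partition of unity: the shape functions sum to ~1.0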

Keywords: numerical computation, element-free Galerkin (EFG), moving least squares (MLS), meshless methods

Procedia PDF Downloads 283
19410 Measuring E-Learning Effectiveness Using a Three-Way Comparison

Authors: Matthew Montebello

Abstract:

The way e-learning effectiveness has typically been measured within an academic setting is by comparing the e-learning medium to the traditional face-to-face teaching methodology. In this paper, a simple yet innovative comparison methodology is introduced, whereby the effectiveness of next-generation e-learning systems is assessed in contrast not only to the face-to-face mode but also to the classical e-learning modality. Ethical and logistical issues are also discussed, as this three-way approach to comparing teaching methodologies was applied and documented in a real empirical study within a higher education institution.

Keywords: e-learning effectiveness, higher education, teaching modality comparison

Procedia PDF Downloads 387
19409 Attitude Stabilization of Satellites Using Random Dither Quantization

Authors: Kazuma Okada, Tomoaki Hashimoto, Hirokazu Tahara

Abstract:

Recently, the effectiveness of the random dither quantization method for linear feedback control systems has been shown in several papers. However, the method has not yet been applied to nonlinear feedback control systems. The objective of this paper is to verify its effectiveness for nonlinear feedback control systems. For this purpose, we consider the attitude stabilization problem of satellites using discrete-level actuators; namely, this paper provides a control method based on random dither quantization for stabilizing the attitude of satellites with discrete-level actuators.
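
For readers unfamiliar with the technique, the sketch below shows the basic random dither quantization operation on a scalar command signal; the quantization step and the commanded value are arbitrary, and the actual actuator model and feedback law of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_dither_quantize(u, step):
    """Quantize u to multiples of `step` after adding uniform dither on [-step/2, step/2]."""
    d = rng.uniform(-0.5 * step, 0.5 * step, size=np.shape(u))
    return step * np.round((u + d) / step)

# The dithered quantizer is unbiased on average: E[q(u)] = u.
u = 0.3                                       # commanded (continuous) torque, arbitrary units
q = random_dither_quantize(np.full(100000, u), step=1.0)
print(q.mean())                               # close to 0.3 although each output is a discrete level (0 or 1 here)
```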

Keywords: quantized control, nonlinear systems, random dither quantization

Procedia PDF Downloads 243
19408 A Teaching Method for Improving Sentence Fluency in Writing

Authors: Manssour Habbash, Srinivasa Rao Idapalapati

Abstract:

Although writing is a multifaceted task, teaching writing is demanding basically for two reasons: grammar and syntax. This article provides a method of teaching writing that was found to be effective in improving students’ academic writing composition skills. The article explains the concepts of ‘guided discovery’ and ‘guided construction’, upon which the method of teaching writing is grounded and developed. After a brief commentary on what the ‘core’ of a text could mean, the article presents an exposition of understanding and identifying the core and building upon it, demonstrating how a teacher can make use of these concepts to improve the writing skills of their students. The method is an adaptation of the grammar-translation method, improvised to suit a student-centered classroom environment. An intervention of teaching writing through this method was tried out in a formal classroom research setup, with positive outcomes. In view of the content’s close relation to classroom practice and its usefulness to practicing teachers, the process and the findings are presented in narrative form, along with the results in tabular form.

Keywords: core of a text, guided construction, guided discovery, theme of a text

Procedia PDF Downloads 381
19407 An Investigation of the Operation and Performance of London Cycle Hire Scheme

Authors: Amer Ali, Jessica Cecchinelli, Antonis Charalambous

Abstract:

Cycling is one of the most environmentally friendly, economical and healthy modes of transport, but it needs more efficient cycle infrastructure and more effective safety measures. This paper presents an investigation into the performance and operation of the London Cycle Hire Scheme, which started to operate in July 2010 with 5,000 cycles and 315 docking stations and currently has more than 10,000 cycles and over 700 docking stations across London, available 24/7, 365 days a year. The study, conducted during the second half of 2014, consists of two parts: a longitudinal review of the hire scheme between its introduction in 2010 and November 2014, and a field survey in November 2014 in the form of face-to-face interviews with users of the cycle scheme to ascertain the limitations and difficulties experienced by those users and how the scheme could be improved in terms of capability and safety. The study also includes a correlation between the usage of the cycle scheme and the corresponding weather conditions. The main findings are that the number of hires increased from just over two million in 2010 to just under ten million in 2014. The field survey showed that 80% of users are satisfied with the performance of the scheme, whilst 50% of users raised concerns about the safety of the available cycle routes and infrastructure. The study also revealed that a high percentage of cycle trips were relatively short (less than 30 minutes). Although weather conditions had some effect on cycling, the cost of using the scheme and major events in London had a greater effect on the number of cycle hires. The key conclusions are that, despite the safety concerns and the lack of infrastructure for continuous routes, an encouraging number of people opted for cycling as a clean, affordable and healthy mode of transport. There is a need to expand the scheme by providing more cycles and docking stations and to support it with more well-designed and well-maintained cycle routes. More details about the development of the London Cycle Hire Scheme during the last five years, its performance and the key issues raised by the surveyed users will be reported in the full version of the paper.

Keywords: cycling mode of transport, london cycle hire scheme, safety, environmental and health benefits, user satisfaction

Procedia PDF Downloads 387
19406 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation

Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi

Abstract:

Medical image analysis is one of the important applications of computer image processing. It involves several processing steps, of which segmentation is one of the most challenging and important. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological binary opening is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph. Segmentation is then completed by applying the level set method to each extracted image. The experimental results, with 90% accuracy, demonstrate that the proposed method achieves high accuracy and promising results.
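
The following sketch illustrates the thresholding and vertical integral projection steps on a synthetic image; the threshold and the gap criterion are illustrative values, not those of the proposed method.

```python
import numpy as np

def separate_teeth(image, threshold):
    """Threshold a grayscale radiograph and use the vertical integral
    projection (column sums) to locate gaps between neighbouring teeth."""
    binary = image > threshold                 # simple global threshold
    projection = binary.sum(axis=0)            # vertical integral projection
    # Gaps between teeth show up as local minima (low column sums).
    gaps = np.where(projection < 0.2 * projection.max())[0]
    return binary, projection, gaps

# Synthetic example: two bright blobs separated by a dark column.
img = np.zeros((64, 64))
img[16:48, 5:28] = 1.0
img[16:48, 36:60] = 1.0
_, proj, gaps = separate_teeth(img, threshold=0.5)
print(gaps)   # the columns between the two blobs (and the borders) have low projection values
```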

Keywords: integral projection, level set method, morphological operation, segmentation

Procedia PDF Downloads 319
19405 A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence

Authors: Li Qiang, Yang Ze-Ming, Liu Bao-Xu, Jiang Zheng-Wei

Abstract:

With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The difficulties of cyber-attack attribution centre on handling huge volumes of data and coping with missing key data. To address this situation, this paper presents a reasoning method for cyber-attack attribution based on threat intelligence. The method uses the intrusion kill chain model and Bayesian networks to build the attack chain and evidence chain of a cyber-attack on a threat intelligence platform through data calculation, analysis and reasoning. We then used a number of cyber-attack events that we have observed and analyzed to test the reasoning method and a demo system; the test results indicate that the reasoning method can provide useful assistance in cyber-attack attribution.
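
The sketch below illustrates the kind of Bayesian evidence combination such a reasoning engine might perform; the actors, indicators and probabilities are invented for illustration and are not taken from the paper or any threat intelligence source.

```python
import math

# Priors over candidate threat actors and per-actor likelihoods of the
# indicators observed along the kill chain -- all numbers are invented.
priors = {"actor_A": 0.5, "actor_B": 0.3, "actor_C": 0.2}
likelihoods = {
    "actor_A": {"infra_overlap": 0.7, "exploit_reuse": 0.4},
    "actor_B": {"infra_overlap": 0.2, "exploit_reuse": 0.6},
    "actor_C": {"infra_overlap": 0.1, "exploit_reuse": 0.1},
}
observed = ["infra_overlap", "exploit_reuse"]

# Naive-Bayes style update: posterior proportional to prior * product of indicator likelihoods.
unnormalized = {
    actor: priors[actor] * math.prod(likelihoods[actor][o] for o in observed)
    for actor in priors
}
total = sum(unnormalized.values())
posterior = {actor: value / total for actor, value in unnormalized.items()}
print(posterior)   # actor_A receives the largest posterior probability
```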

Keywords: reasoning, Bayesian networks, cyber-attack attribution, Kill Chain, threat intelligence

Procedia PDF Downloads 452
19404 Broadband Ultrasonic and Rheological Characterization of Liquids Using Longitudinal Waves

Authors: M. Abderrahmane Mograne, Didier Laux, Jean-Yves Ferrandis

Abstract:

Rheological characterization of complex liquids such as polymer solutions is of great scientific interest to researchers in many fields, such as biology, the food industry and chemistry. In order to establish master curves (elastic moduli vs frequency) which can give information about microstructure, classical rheometers or viscometers (such as Couette systems) are used. For broadband characterization of a sample, the temperature is varied over a very large range, leading to equivalent frequency shifts through the Time-Temperature Superposition principle. For many liquids undergoing phase transitions, this approach is not applicable. That is why the development of broadband spectroscopic methods around room temperature has become a major concern. In the literature, many solutions have been proposed but, to our knowledge, there is no experimental bench giving the whole rheological characterization for frequencies from a few Hz to many MHz. Consequently, our goal is to investigate, in a nondestructive way and over a very broad frequency band (a few Hz to hundreds of MHz), rheological properties using longitudinal ultrasonic waves (L waves), a unique experimental bench and a specific container for the liquid: a test tube. More specifically, we aim to estimate the three viscosities (longitudinal, shear and bulk) and the complex elastic moduli (M*, G* and K*), respectively the longitudinal, shear and bulk moduli. We have decided to use only L waves conditioned in two ways: bulk L waves in the liquid or guided L waves in the test tube walls. In this paper, we will present first results for very low frequencies using the ultrasonic tracking of a falling ball in the test tube. This will lead to the estimation of shear viscosities from a few mPa·s to a few Pa·s. Corrections due to the small dimensions of the tube will be applied and discussed with regard to the size of the falling ball. Then the use of bulk L-wave propagation in the liquid and the development of a specific signal processing scheme to assess longitudinal velocity and attenuation will lead to the evaluation of the longitudinal viscosity in the MHz frequency range. Finally, the first results concerning the propagation, generation and processing of guided compressional waves in the test tube walls will be discussed. All these approaches and results will be compared with standard methods available and already validated in our lab.
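
As an illustration of the low-frequency falling-ball step, the sketch below estimates shear viscosity from a measured terminal velocity using Stokes' law with a standard Faxén-type wall correction; the correction coefficients and the example numbers are textbook values and illustrative inputs, not the authors' calibrated corrections.

```python
def shear_viscosity_falling_ball(radius, rho_ball, rho_liquid, v_measured, tube_diameter):
    """Estimate shear viscosity (Pa·s) from the terminal velocity of a falling ball.

    Stokes' law with a Faxen-type wall correction for a ball falling on the
    axis of a cylindrical tube; the correction coefficients are standard
    literature values, not necessarily those used by the authors."""
    g = 9.81                                   # m/s^2
    lam = 2.0 * radius / tube_diameter         # ball diameter / tube diameter
    wall_factor = 1.0 - 2.104 * lam + 2.09 * lam**3 - 0.95 * lam**5
    v_unbounded = v_measured / wall_factor     # velocity the ball would have in an infinite medium
    return 2.0 * (rho_ball - rho_liquid) * g * radius**2 / (9.0 * v_unbounded)

# Example: 1 mm steel ball tracked ultrasonically in a 16 mm test tube (illustrative numbers).
mu = shear_viscosity_falling_ball(radius=0.5e-3, rho_ball=7800.0,
                                  rho_liquid=1000.0, v_measured=0.012,
                                  tube_diameter=16e-3)
print(f"{mu:.3f} Pa·s")
```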

Keywords: nondestructive measurement for liquid, piezoelectric transducer, ultrasonic longitudinal waves, viscosities

Procedia PDF Downloads 265
19403 Use of Smartphone in Practical Classes to Facilitate Teaching and Learning of Microscopic Analysis and Interpretation of Tissues Sections

Authors: Lise P. Labéjof, Krisnayne S. Ribeiro, Nicolle P. dos Santos

Abstract:

An experiment, not previously reported, on the use of the smartphone as a tool in practical histology classes is presented in this article. The behaviour and learning of students in three science courses at the university were analyzed and compared, as well as the mode of teaching of this discipline and the students' appreciation of it, using either digital photographs taken by phone or drawings to record microscopic observations and to analyze and interpret histological sections of human or animal tissues.

Keywords: cell phone, digital micrographies, learning of sciences, teaching practices

Procedia PDF Downloads 598
19402 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

In order to prevent target splitting and ensure the accuracy of fusion, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, including distance error and angle error, this paper uses an offline estimation method for error registration. Suppose several sonars from different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar’s own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate the sensor biases. The RTQC method takes the average value of each sonar’s data as the observation value, whereas the LS method performs least-squares processing of each sonar’s data to obtain the observation value. MATLAB simulations in an underwater acoustic environment show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is also compared through the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The system error of the RTQC method converges rapidly, but the distribution of targets has a serious impact on its performance. The LS method is not affected by the target distribution, but increasing random noise slows down its convergence rate. The LS method is an improvement of the RTQC method and is widely used in two-dimensional registration. The improved method can be used for registration in underwater multi-target detection.
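
The sketch below illustrates a least-squares bias estimation of the kind described, for two sonars observing a common target track; the linearized range/bearing bias model, the geometry and the noise levels are invented for illustration and are not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known sonar positions (x, y) in metres and their true range/bearing biases.
sonar_pos = np.array([[0.0, 0.0], [800.0, 0.0]])
true_bias = np.array([15.0, 0.02, -10.0, -0.015])      # [dr1, dtheta1, dr2, dtheta2]

# Simulated target track observed by both sonars (offline data set).
targets = np.column_stack([np.linspace(200, 600, 40), np.linspace(900, 400, 40)])

def measure(p, s, dr, dth, sigma_r=1.0, sigma_th=0.002):
    """Range/bearing measurement corrupted by a constant bias and random noise."""
    d = p - s
    r = np.hypot(*d) + dr + rng.normal(0, sigma_r)
    th = np.arctan2(d[1], d[0]) + dth + rng.normal(0, sigma_th)
    return r, th

rows, rhs = [], []
for p in targets:
    r1, t1 = measure(p, sonar_pos[0], true_bias[0], true_bias[1])
    r2, t2 = measure(p, sonar_pos[1], true_bias[2], true_bias[3])
    p1 = sonar_pos[0] + r1 * np.array([np.cos(t1), np.sin(t1)])
    p2 = sonar_pos[1] + r2 * np.array([np.cos(t2), np.sin(t2)])
    u1, w1 = np.array([np.cos(t1), np.sin(t1)]), r1 * np.array([np.sin(t1), -np.cos(t1)])
    u2, w2 = np.array([np.cos(t2), np.sin(t2)]), r2 * np.array([np.sin(t2), -np.cos(t2)])
    # Linearized consistency: bias-corrected positions from both sonars must agree.
    rows.append(np.column_stack([-u1, w1, u2, -w2]))
    rhs.append(p2 - p1)

A = np.vstack(rows)
b = np.concatenate(rhs)
bias_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(bias_hat)   # should be close to true_bias
```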

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 169
19401 Modal FDTD Method for Wave Propagation Modeling Customized for Parallel Computing

Authors: H. Samadiyeh, R. Khajavi

Abstract:

A new finite-difference-based procedure, the modal finite difference method (MFDM), is proposed for seismic wave propagation modeling, in which the simulation is carried out in the modal space. The method employs the eigenvalues of a characteristic matrix formed by appropriate time-space FD stencils. Since MFD runs for different modes are totally independent of each other, MFDM can easily be parallelized, and considerable simplicity in the parallel algorithm is achieved: no domain-decomposition procedure or inter-core data exchange is required. More importantly, it is possible to skip the processing of less significant modes, which enables one to adjust the procedure to the level of accuracy needed. Thus, in addition to considerable ease of parallel programming, computation and storage costs are significantly reduced. The efficiency of the method is demonstrated by numerical examples.
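
The sketch below illustrates the modal independence that MFDM exploits, on a 1D wave equation: after eigendecomposition of the spatial FD operator, every mode evolves as an independent scalar problem and less significant modes can simply be skipped; this is a generic illustration, not the authors' time-space stencil formulation.

```python
import numpy as np

# 1D wave equation u_tt = c^2 u_xx on a fixed-end string, semi-discretized in space.
n, L, c = 200, 1.0, 1.0
h = L / (n + 1)
D = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2            # second-difference matrix

lam, V = np.linalg.eigh(D)                            # D = V diag(lam) V^T
omega = c * np.sqrt(-lam)                             # modal angular frequencies

x = np.linspace(h, L - h, n)
u0 = np.exp(-200.0 * (x - 0.3)**2)                    # initial displacement, zero velocity
q0 = V.T @ u0                                         # modal initial conditions

def solution(t, n_modes=n):
    """Each mode evolves independently: q_j(t) = q0_j * cos(omega_j * t).
    Keeping only the n_modes most significant modes mimics the mode-skipping idea;
    different modes could be handed to different processes."""
    idx = np.argsort(np.abs(q0))[::-1][:n_modes]
    return V[:, idx] @ (q0[idx] * np.cos(omega[idx] * t))

u_full = solution(0.25)
u_trunc = solution(0.25, n_modes=60)
print(np.max(np.abs(u_full - u_trunc)))               # truncation error of the reduced run
```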

Keywords: Finite Difference Method, Graphics Processing Unit (GPU), Message Passing Interface (MPI), Modal, Wave propagation

Procedia PDF Downloads 297
19400 Stochastic Nuisance Flood Risk for Coastal Areas

Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong

Abstract:

The U.S. Federal Emergency Management Agency (FEMA) developed flood maps based on experts’ experience and estimates of the probability of flooding. Current flood-risk models evaluate flood risk with regional and subjective measures, without accounting for torrential rain and nuisance flooding at the neighborhood level. Nuisance flooding occurs in small areas of the community, where a few streets or blocks are routinely impacted. This type of flooding event occurs when a torrential rainstorm combined with high tide and sea-level rise temporarily exceeds a given threshold; in South Florida, this threshold is 1.7 ft above Mean Higher High Water (MHHW). The National Weather Service defines torrential rain as rain falling at a rate greater than 0.3 inches per hour or three inches in a single day. Data from the Florida Climate Center, 1970 to 2020, show 371 events with more than three inches of rain in a day across 612 months. The purpose of this research is to develop a data-driven method to determine comprehensive analytical damage-avoidance criteria that account for nuisance flood events at the single-family home level. The method developed uses the Failure Mode and Effects Analysis (FMEA) method from the American Society for Quality (ASQ) to estimate the Damage Avoidance (DA) preparation for a 1-day, 100-year storm. The Consequence of Nuisance Flooding (CoNF) is estimated from community mitigation efforts to prevent nuisance flooding damage. The Probability of Nuisance Flooding (PoNF) is derived from the frequency and duration of torrential rainfall causing delays and community disruptions to daily transportation, human illnesses, and property damage. Urbanization and population changes are related to the U.S. Census Bureau's annual population estimates. Data collected by the United States Department of Agriculture (USDA) Natural Resources Conservation Service’s National Resources Inventory (NRI) and locally by the South Florida Water Management District (SFWMD) track development and land use/land cover changes over time. The intent is to include temporal trends in population density growth and their impact on land development. Results from this investigation provide the risk of nuisance flooding as a function of CoNF and PoNF for coastal areas of South Florida. The data-based criterion provides awareness to local municipalities of their flood risk and gives insight into flood management actions and watershed development.

Keywords: flood risk, nuisance flooding, urban flooding, FMEA

Procedia PDF Downloads 100
19399 Estimation of Dynamic Characteristics of a Middle Rise Steel Reinforced Concrete Building Using Long-Term Earthquake Observation Records

Authors: Fumiya Sugino, Naohiro Nakamura, Yuji Miyazu

Abstract:

In the earthquake-resistant design of buildings, the evaluation of vibration characteristics is important. In recent years, with the increase in super-high-rise buildings, the evaluation of the response has become important not only for the first mode but also for higher modes. Knowledge of the vibration characteristics of buildings is mostly limited to the first mode, and knowledge of higher modes is still insufficient. In this paper, the characteristics of the first and second modes were studied using earthquake observation records of an SRC building, by applying a frequency filter to an ARX model. First, we studied the change of the eigenfrequency and the damping ratio during the 3.11 earthquake. The eigenfrequency gradually decreases from the onset of the earthquake and becomes almost stable after about 150 seconds. At this time, the decreasing rates of the 1st and 2nd eigenfrequencies are both about 0.7. Although the damping ratio has a larger error than the eigenfrequency, both the 1st and 2nd damping ratios are 3 to 5%. There is also a strong correlation between the 1st and 2nd eigenfrequencies, with regression line y = 3.17x; for the damping ratio, the regression line is y = 0.90x, so the 1st and 2nd damping ratios are approximately the same. Next, we studied the eigenfrequency and damping ratio from 1998 to 2014, including the period after the 3.11 earthquake, with all considered earthquakes connected in order of occurrence. The eigenfrequency slowly declined from immediately after completion of the building and tended to stabilize after several years, although it declined greatly after the 3.11 earthquake. The decreasing rates of both the 1st and 2nd eigenfrequencies up to about 7 years later are about 0.8. For the damping ratio, both the 1st and 2nd are about 1 to 6%; after the 3.11 earthquake, the 1st increases by about 1% and the 2nd by less than 1%. For the eigenfrequency, there is a strong correlation between the 1st and 2nd, with regression line y = 3.17x; for the damping ratio, the regression line is y = 1.01x, so the 1st and 2nd damping ratios are approximately the same. Based on the above results, the changes in eigenfrequency and damping ratio are summarized as follows. In the long term, both the 1st and 2nd eigenfrequencies gradually declined from immediately after completion, tended to stabilize after a few years, and declined further after the 3.11 earthquake; there is a strong correlation between the 1st and 2nd, and the timing of the decline and the decreasing rate are of the same degree. In the long term, both the 1st and 2nd damping ratios are about 1 to 6%; after the 3.11 earthquake, the 1st increases by about 1% and the 2nd by less than 1%, and the 1st and 2nd remain approximately the same.
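
The sketch below shows the basic step of extracting an eigenfrequency and damping ratio from the poles of a fitted AR/ARX model, using a simulated free-decay record; the paper additionally applies a frequency filter and works with recorded earthquake input-output data, which is not reproduced here.

```python
import numpy as np

# Identify eigenfrequency and damping ratio from a (simulated) response record
# by fitting an AR(2) model and converting its poles to continuous time.
dt, f_true, zeta_true = 0.01, 1.2, 0.03
t = np.arange(0, 60, dt)
wd = 2 * np.pi * f_true * np.sqrt(1 - zeta_true**2)
y = np.exp(-zeta_true * 2 * np.pi * f_true * t) * np.cos(wd * t)
y += 0.01 * np.random.default_rng(0).normal(size=t.size)       # measurement noise

# Least-squares fit of y[k] = a1*y[k-1] + a2*y[k-2]  (free decay, no input term).
Phi = np.column_stack([y[1:-1], y[:-2]])
a1, a2 = np.linalg.lstsq(Phi, y[2:], rcond=None)[0]

# Discrete poles -> continuous poles -> modal frequency and damping ratio.
z = np.roots([1.0, -a1, -a2])
s = np.log(z[0]) / dt
freq = abs(s) / (2 * np.pi)
damping = -s.real / abs(s)
print(freq, damping)        # ~1.2 Hz and ~0.03
```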

Keywords: eigenfrequency, damping ratio, ARX model, earthquake observation records

Procedia PDF Downloads 217
19398 Radial Distortion Correction Based on the Concept of Verifying the Planarity of a Specimen

Authors: Shih-Heng Tung, Ming-Hsiang Shih, Wen-Pei Sung

Abstract:

Because of the rapid development of digital cameras and computers, the digital image correlation method has drawn much attention recently and has been applied to a variety of fields. However, image distortion is inevitable when an image is captured through a lens, and this distortion can result in a non-negligible error when the digital image correlation method is used. There are already many ways to correct image distortion, and most of them require specific image patterns or precise control points. A new distortion correction method is proposed in this study. The proposed method is based on the fact that a flat surface should remain flat when it is measured using a three-dimensional (3D) digital image measurement technique. Lens distortion can be divided into radial distortion, decentering distortion and thin prism distortion. Because radial distortion has a more noticeable influence than the other types, this method deals only with radial distortion. A simplified 3D digital image measurement technique is adopted to measure the surface coordinates of a flat specimen, and the gradient method is then applied to find the best correction parameters. Several experiments are carried out in this study to verify the correctness of the method. The results show that the method achieves good accuracy and is suitable for both large and small distortion conditions. Its most important advantage is that it requires neither markers with a specific pattern nor precise control points.
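
The sketch below shows the polynomial radial distortion model and its numerical inversion, which is the operation applied once the correction parameters have been found; the coefficients and image size are arbitrary, and the paper's planarity-based gradient search for those parameters is not reproduced.

```python
import numpy as np

def distort(pts, k1, k2, center):
    """Apply the polynomial radial distortion model
    r_d = r_u * (1 + k1*r_u^2 + k2*r_u^4) to undistorted pixel coordinates."""
    d = pts - center
    r2 = np.sum(d**2, axis=1, keepdims=True)
    return center + d * (1.0 + k1 * r2 + k2 * r2**2)

def undistort(pts, k1, k2, center, iters=20):
    """Invert the model by fixed-point iteration: start from the distorted
    coordinates and repeatedly divide out the current radial factor."""
    d = pts - center
    u = d.copy()
    for _ in range(iters):
        r2 = np.sum(u**2, axis=1, keepdims=True)
        u = d / (1.0 + k1 * r2 + k2 * r2**2)
    return center + u

center = np.array([320.0, 240.0])
grid = np.stack(np.meshgrid(np.linspace(0, 640, 9), np.linspace(0, 480, 7)), -1).reshape(-1, 2)
k1, k2 = 1.2e-7, 3.0e-14                      # illustrative distortion coefficients
warped = distort(grid, k1, k2, center)
recovered = undistort(warped, k1, k2, center)
print(np.max(np.abs(recovered - grid)))       # close to zero
```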

Keywords: 3D DIC, radial distortion, distortion correction, planarity

Procedia PDF Downloads 551
19397 Solving 94-Bit ECDLP with 70 Computers in Parallel

Authors: Shunsuke Miyoshi, Yasuyuki Nogami, Takuya Kusaka, Nariyoshi Yamai

Abstract:

The elliptic curve discrete logarithm problem (ECDLP) is one of the problems on which the security of pairing-based cryptography is based. This paper considers Pollard's rho method for evaluating the security of the ECDLP on a Barreto-Naehrig (BN) curve, which is an efficient pairing-friendly curve. Some techniques are proposed to make the rho method efficient; in particular, the group structure on the BN curve, the distinguished point method and the Montgomery trick are well-known techniques, and this paper applies them and shows their optimization. According to the experimental results, for which a large-scale parallel system with MySQL was applied, a 94-bit ECDLP was solved in about 28 hours by parallelizing 71 computers.
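
To make the distinguished-point idea concrete, the toy sketch below runs Pollard's rho in a small prime-order subgroup of Z_p^* rather than on a BN curve; the walk partition, the distinguished-point criterion and the group are illustrative stand-ins, and the elliptic-curve arithmetic and Montgomery trick of the paper are not reproduced.

```python
import random

# Toy Pollard's rho with a pseudo-random walk and distinguished points,
# solving g^x = h in a prime-order subgroup of Z_p^* instead of on a BN curve.
p, q = 10007, 5003            # q = (p - 1) // 2 is the (prime) subgroup order
g = pow(5, 2, p)              # generator of the order-q subgroup
x_secret = 1234
h = pow(g, x_secret, p)

def step(X, a, b):
    """One walk step, keeping the representation X = g^a * h^b; branch chosen by X mod 3."""
    if X % 3 == 0:
        return (X * h) % p, a, (b + 1) % q
    if X % 3 == 1:
        return (X * X) % p, (2 * a) % q, (2 * b) % q
    return (X * g) % p, (a + 1) % q, b

table = {}                    # distinguished point -> (a, b), shared by all walkers
random.seed(0)
solved = None
while solved is None:
    # Each "walker" (one computer in the parallel setting) starts at a random point.
    a, b = random.randrange(q), random.randrange(q)
    X = (pow(g, a, p) * pow(h, b, p)) % p
    for _ in range(4 * int(q ** 0.5)):
        X, a, b = step(X, a, b)
        if X % 16 == 0:                      # distinguished-point criterion
            if X in table and table[X][1] != b:
                a0, b0 = table[X]
                solved = (a0 - a) * pow(b - b0, -1, q) % q
            table[X] = (a, b)
            break                            # walker reports the point and restarts

print(solved == x_secret)      # True
```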

Keywords: Pollard's rho method, BN curve, Montgomery multiplication

Procedia PDF Downloads 272
19396 A Method for Multimedia User Interface Design for Mobile Learning

Authors: Shimaa Nagro, Russell Campion

Abstract:

Mobile devices are becoming ever more widely available, with growing functionality, and are increasingly used as an enabling technology to give students access to educational material anytime and anywhere. However, the design of educational material user interfaces for mobile devices is beset by many unresolved research issues, such as those arising from emphasising the information concepts and then mapping this information to appropriate media (modelling information, then mapping media effectively). This report describes a multimedia user interface design method for mobile learning. The method covers the specification of user requirements and information architecture, media selection to represent the information content, design for directing attention to important information, and interaction design to enhance user engagement, based on Human-Computer Interaction (HCI) design strategies. The method will be evaluated through three case studies to prove that it is suitable for application to different areas: an application to teach major computer networking concepts, an application to deliver a history-based topic (after these two case studies have been completed, the method will be revised to remove deficiencies and then used to develop the third case study), and an application to teach mathematical principles. At that point, the method will be revised into its final format. A usability evaluation will be carried out to measure the usefulness and effectiveness of the method. The investigation will combine qualitative and quantitative methods, including interviews and questionnaires for data collection, and the three case studies for validating the method. The researcher has produced the method, which is now under validation and testing. From this point forward in the report, the method is referred to by the abbreviation MDMLM, the Multimedia Design Mobile Learning Method.

Keywords: human-computer interaction, interface design, mobile learning, education

Procedia PDF Downloads 247
19395 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings

Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski

Abstract:

The methodology used to measure the reduction of transmitted impact sound by floor coverings placed on a massive floor is described in ISO 10140-3:2010. To carry out such tests, a standardised reverberation room separated by a standard floor from a second measuring room is required. The need for a special laboratory results in high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement provided by floor coverings; this method requires neither standard rooms nor a standard floor. This paper describes the measurement procedure of the proposed engineering method. Verification tests were then performed; validation of the proposed method was based on an analytical model, a Statistical Energy Analysis (SEA) model and empirical measurements. The results were compared with the corresponding ones obtained from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.

Keywords: building acoustic, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound

Procedia PDF Downloads 324
19394 Stabilization of Rotational Motion of Spacecrafts Using Quantized Two Torque Inputs Based on Random Dither

Authors: Yusuke Kuramitsu, Tomoaki Hashimoto, Hirokazu Tahara

Abstract:

The control problem of underactuated spacecraft has attracted a considerable amount of interest. A control method for a spacecraft equipped with fewer than three control torques is useful when one of the three control torques has failed. On the other hand, the quantized control of systems is one of the important research topics of recent years. The random dither quantization method, which transforms a given continuous signal into a discrete signal by adding artificial random noise to the continuous signal before quantization, has also attracted considerable interest. The objective of this study is to develop a control method based on random dither quantization for stabilizing the rotational motion of a rigid spacecraft with two control inputs. In this paper, the effectiveness of the random dither quantization control method for the stabilization of the rotational motion of spacecraft with two torque inputs is verified by numerical simulations.

Keywords: spacecraft control, quantized control, nonlinear control, random dither method

Procedia PDF Downloads 180
19393 Effect of Non-Regulated pH on the Dynamics of Dark Fermentative Biohydrogen Production with Suspended and Immobilized Cell Culture

Authors: Joelle Penniston, E. B. Gueguim-Kana

Abstract:

Biohydrogen has been identified as a promising alternative to the use of non-renewable fossil reserves, owing to its sustainability and non-polluting nature. pH is considered a key parameter in fermentative biohydrogen production processes due to its effect on hydrogenase activity, metabolic activity and substrate hydrolysis. The present study assesses the influence of pH regulation on dark fermentative biohydrogen production. Four experimental hydrogen production schemes were evaluated: suspended cells under pH-regulated growth conditions (Sus_R), suspended cells under non-regulated pH (Sus_N), alginate-immobilized cells under pH-regulated growth conditions (Imm_R), and immobilized cells under non-regulated pH conditions (Imm_N). All experiments were carried out at 37.5°C with glucose as the sole carbon source. Sus_R showed a lag time of 5 hours, a peak hydrogen fraction of 36% and a glucose degradation of 37%, compared with Sus_N, which showed a peak hydrogen fraction of 44% and complete glucose degradation. Both suspended culture systems showed a higher peak biohydrogen fraction than the immobilized cell systems. Imm_R showed a lag phase of 8 hours and a peak biohydrogen fraction of 35%, while Imm_N showed a lag phase of 5 hours and a peak biohydrogen fraction of 22%; 100% glucose degradation was observed in both the pH-regulated and non-regulated processes. This study showed that biohydrogen production in batch mode with suspended cells in a non-regulated pH environment results in partial degradation of the substrate, with a lower yield; this scheme has been the culture mode of choice in most reported biohydrogen studies. The relatively lower slope of the pH trend in the non-regulated experiment with immobilized cells (Imm_N), compared with Sus_N, revealed that immobilized systems have a better buffering capacity than suspended systems, which allows extended biohydrogen production even under non-regulated pH conditions. However, alginate-immobilized cultures in flask systems showed some drawbacks associated with the high rate of gas production, which increases the buoyancy of the immobilization beads and ultimately impedes the release of gas from the flask.

Keywords: biohydrogen, sustainability, suspended, immobilized

Procedia PDF Downloads 342
19392 Modeling of the Heat and Mass Transfer in Fluids through Thermal Pollution in Pipelines

Authors: V. Radulescu, S. Dumitru

Abstract:

Introduction: Determining the temperature field inside a fluid in motion raises many practical issues, especially in the case of turbulent flow. The phenomenon is more pronounced when the solid walls have a different temperature than the fluid. Turbulent heat and mass transfer play an essential role in thermal pollution, as was recorded during the damage at the Oradea thermoelectric power plant (still closed today). Basic Methods: Solving the turbulent thermal pollution problem theoretically is particularly difficult. By using semi-empirical theories, or by simplifying the assumptions made on the basis of experimental measurements, a mathematical model can be elaborated for further numerical simulations. The three zones of the flow are analyzed separately: the vicinity of the solid wall, the turbulent transition zone, and the turbulent core; for each zone the temperature distribution law is determined. The dependence between the Stanton and Prandtl numbers is determined with correction factors based on experimental measurements. Major Findings/Results: The limit of the laminar thermal sublayer was determined based on the theory of Landau and Levich, using the assumption that the longitudinal component of the velocity pulsation and the pulsation frequency vary proportionally with the distance to the wall. For the calculation of the average temperature, a solution similar to that for the velocity is used, by an analogous averaging. On these assumptions, numerical modeling was performed with a temperature gradient for turbulent flow in pipes (intact or damaged, with cracks) having four different diameters, between 200 and 500 mm, as in the Oradea thermoelectric power plant. Conclusions: A superposition of the molecular and turbulent viscosities was made, followed by the addition of the molecular and turbulent transfer coefficients, as needed to elaborate the theoretical and numerical models. The laminar boundary layer has a different thickness when the flow with heat transfer is compared with the flow without a temperature gradient. The obtained results are within a margin of error of 5% between the classical semi-empirical theories and the developed model, based on the experimental data. Finally, a general correlation between the Stanton number and the Prandtl number is obtained for a specific flow (with the associated Reynolds number).
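
As background for the St-Pr correlation discussed above, the sketch below evaluates two classical textbook estimates of the Stanton number for turbulent pipe flow; these standard correlations are shown only for orientation and are not the corrected correlation derived in the paper.

```python
def stanton_numbers(Re, Pr):
    """Two classical estimates of the Stanton number for fully developed
    turbulent flow in a smooth pipe; textbook correlations standing in for
    the paper's experimentally corrected St-Pr relation."""
    Nu = 0.023 * Re**0.8 * Pr**0.4              # Dittus-Boelter (heating)
    St_db = Nu / (Re * Pr)
    f = 0.316 * Re**-0.25                       # Blasius (Darcy) friction factor
    St_colburn = (f / 8.0) * Pr**(-2.0 / 3.0)   # Colburn analogy: St * Pr^(2/3) = f/8
    return St_db, St_colburn

for Re in (1e4, 5e4, 1e5):
    print(Re, stanton_numbers(Re, Pr=5.0))      # water-like Prandtl number
```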

Keywords: experimental measurements, numerical correlations, thermal pollution through pipelines, turbulent thermal flow

Procedia PDF Downloads 165
19391 Prevalence and Correlates of Complementary and Alternative Medicine Use among Diabetic Patients in Lebanon: A Cross-Sectional Study

Authors: Farah Naja, Mohamad Alameddine

Abstract:

Background: The difficulty of compliance with the therapeutic and lifestyle management of type 2 diabetes mellitus (T2DM) encourages patients to use complementary and alternative medicine (CAM) therapies. Little is known about the prevalence and mode of CAM use among diabetics in the Eastern Mediterranean Region in general and Lebanon in particular. Objective: To assess the prevalence and modes of CAM use among patients with T2DM residing in Beirut, Lebanon. Methods: A cross-sectional survey was conducted among T2DM patients recruited from two major referral centers - a public hospital and a private academic medical center in Beirut. In a face-to-face interview, participants completed a survey questionnaire comprising three sections: socio-demographics, diabetes characteristics, and types and modes of CAM use. Descriptive statistics and univariate and multivariate logistic regression analyses were used to assess the prevalence, mode and correlates of CAM use in the study population. The main outcome in this study (CAM use) was defined as using CAM at least once since diagnosis with T2DM. Results: A total of 333 T2DM patients completed the survey (response rate: 94.6%). The prevalence of CAM use in the study population was 38%, 95% CI (33.1-43.5). After adjustment, CAM use was significantly associated with a “married” status, a longer duration of T2DM, the presence of disease complications, and a positive family history of the disease. Folk foods and herbs were the most commonly used CAM, followed by natural health products. One in five patients used CAM as an alternative to conventional treatment. Only 7% of CAM users disclosed their CAM use to their treating physician. Health care practitioners were the least cited (7%) influence on the choice of CAM among users. Conclusion: The use of CAM therapies among T2DM patients in Lebanon is prevalent. Decision makers and care providers must fully understand the potential risks and benefits of CAM therapies to appropriately advise their patients. Attention must be dedicated to educating T2DM patients on the importance of disclosing CAM use to their physicians, especially patients with a family history of diabetes and those using conventional therapy for a long time.

Keywords: nutritional supplements, type 2 diabetes mellitus, complementary and alternative medicine (CAM), conventional therapy

Procedia PDF Downloads 350
19390 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System

Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung- Hsien Lin

Abstract:

The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation, which involves calculations on very large numbers, and such large-number operations are a very heavy burden for the CPU. To increase computational speed, in addition to improving the algorithms themselves (such as the binary method, the sliding window method, the addition chain method, and so on), a cluster computer can be used. The cluster system is composed of computers in our laboratory on which MPICH2 is installed. The parallel procedures of the modular exponentiation can be processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiation for operands longer than 512 bits and even longer than 1024 bits.
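
The sketch below implements the sliding window modular exponentiation mentioned above as a plain serial routine; the window width is an illustrative choice, and the MPICH2-based parallel combination with the addition chain method is not reproduced.

```python
import random

def sliding_window_pow(base, exponent, modulus, w=4):
    """Left-to-right sliding-window modular exponentiation.

    Precomputes the odd powers base^1, base^3, ..., base^(2^w - 1) and scans the
    exponent bits, which cuts the number of multiplications compared with the
    plain binary (square-and-multiply) method."""
    if exponent == 0:
        return 1 % modulus
    # Precompute the odd powers.
    table = {1: base % modulus}
    base_sq = (base * base) % modulus
    for odd in range(3, 1 << w, 2):
        table[odd] = (table[odd - 2] * base_sq) % modulus

    bits = bin(exponent)[2:]
    result, i = 1, 0
    while i < len(bits):
        if bits[i] == '0':
            result = (result * result) % modulus
            i += 1
        else:
            # Take the longest window (<= w bits) that ends in a 1.
            j = min(i + w, len(bits))
            while bits[j - 1] == '0':
                j -= 1
            window = int(bits[i:j], 2)
            for _ in range(j - i):
                result = (result * result) % modulus
            result = (result * table[window]) % modulus
            i = j
    return result

# Sanity check against Python's built-in modular exponentiation.
random.seed(0)
m = random.getrandbits(1024) | 1
e = random.getrandbits(1024)
b = random.getrandbits(1024)
print(sliding_window_pow(b, e, m) == pow(b, e, m))   # True
```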

Keywords: cluster system, modular exponentiation, sliding window, addition chain

Procedia PDF Downloads 524
19389 An Approach for Determining and Reducing Vehicle Turnaround Time for Outbound Logistics by Using Critical Path Method

Authors: Prajakta M. Wazat, D. N. Raut

Abstract:

The study concerns a fast-moving consumer goods (FMCG) beverage company in which the portion of the supply chain that deals with outbound logistics is taken up for improvement, in order to reduce its logistics cost by using the critical path method (CPM). Logistics is a major portion of the supply chain for which customers are not willing to pay, as it adds cost to the product without adding value. In this study, it is necessary to ensure that products are delivered to clients at the right time while preserving high quality standards from the beginning to the end of the supply chain. CPM is a logical sequencing method in which the most efficient route is achieved by arranging the series of events. CPM makes it possible to identify the critical factors and thus to minimize delays and interruptions by providing a feasible solution.
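
The sketch below shows a forward/backward CPM pass on a hypothetical outbound-logistics activity network; the activities and durations are invented for illustration and are not the data of the study.

```python
from collections import defaultdict

# Activity -> (duration in hours, list of predecessors); a hypothetical
# outbound-logistics sequence, not the durations from the study.
activities = {
    "order_release":  (1, []),
    "picking":        (3, ["order_release"]),
    "quality_check":  (1, ["picking"]),
    "invoicing":      (2, ["order_release"]),
    "loading":        (2, ["quality_check", "invoicing"]),
    "gate_clearance": (1, ["loading"]),
}

# Forward pass: earliest start (ES) and earliest finish (EF).
ES, EF = {}, {}
for act in activities:                       # dict order already respects precedence here
    dur, preds = activities[act]
    ES[act] = max((EF[p] for p in preds), default=0)
    EF[act] = ES[act] + dur
project_duration = max(EF.values())

# Backward pass: latest finish (LF) and latest start (LS).
successors = defaultdict(list)
for act, (_, preds) in activities.items():
    for p in preds:
        successors[p].append(act)
LF, LS = {act: project_duration for act in activities}, {}
for act in reversed(list(activities)):
    if successors[act]:
        LF[act] = min(LS[s] for s in successors[act])
    LS[act] = LF[act] - activities[act][0]

critical_path = [a for a in activities if ES[a] == LS[a]]   # zero total float
print(project_duration, critical_path)
```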

Keywords: FMCG, supply chain, outbound logistics, vehicle turnaround time, critical path method, cost reduction

Procedia PDF Downloads 167
19388 Element-Independent Implementation for Method of Lagrange Multipliers

Authors: Gil-Eon Jeong, Sung-Kie Youn, K. C. Park

Abstract:

The treatment of non-matching interfaces is an important computational issue. To handle this problem, the method of Lagrange multipliers, in its classical and localized versions, is the most popular technique. It essentially imposes the interface compatibility conditions by introducing Lagrange multipliers. However, the numerical system becomes unstable and inefficient because of the Lagrange multipliers. An interface element-independent formulation that does not include the Lagrange multipliers can be obtained by modifying the independent variables mathematically. Through this modification, a more efficient and stable system can be achieved while retaining accuracy equivalent to that of the conventional method. A numerical example is conducted to verify the validity of the presented method.
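
The sketch below contrasts the classical Lagrange multiplier coupling with a multiplier-free change of variables on a toy two-subdomain spring model; it illustrates the general idea of eliminating the multipliers by modifying the independent variables, not the specific element-independent formulation of the paper.

```python
import numpy as np

# Two 1D spring-chain subdomains (unit stiffness) joined at one interface DOF.
K1 = np.array([[2.0, -1.0], [-1.0, 1.0]])        # free DOFs of side 1: u_a, u_b
K2 = np.array([[1.0, -1.0], [-1.0, 2.0]])        # free DOFs of side 2: u_b', u_c
K = np.block([[K1, np.zeros((2, 2))], [np.zeros((2, 2)), K2]])
f = np.array([0.0, 1.0, 0.0, 0.0])               # unit load on the interface node of side 1
B = np.array([[0.0, 1.0, -1.0, 0.0]])            # compatibility constraint u_b - u_b' = 0

# (a) Classical method of Lagrange multipliers: indefinite saddle-point system.
saddle = np.block([[K, B.T], [B, np.zeros((1, 1))]])
u_lambda = np.linalg.solve(saddle, np.append(f, 0.0))
u_mult = u_lambda[:4]

# (b) Multiplier-free alternative: eliminate u_b' = u_b through a change of
#     variables (reduced unknowns u_a, u_b, u_c), giving a smaller SPD system.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
u_red = np.linalg.solve(T.T @ K @ T, T.T @ f)
print(np.allclose(u_mult, T @ u_red))            # True: identical displacement fields
```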

Keywords: element-independent formulation, interface coupling, methods of Lagrange multipliers, non-matching interface

Procedia PDF Downloads 404
19387 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana

Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta

Abstract:

Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigating environmental pressure and connecting waste streams (including CO2 and heat emissions), a microalgae bioeconomy can supply the food, feed, pharmaceutical and power industries with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents the state-of-the-art technology for a microalgae bioeconomy, from the cultivation process to the production of valuable components and by-streams. The microalga Chlorella sorokiniana was cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, such as a raceway pond (5000 L) and flat-panel reactors (8 x 180 L). In order to achieve optimum growth conditions along with a cellular composition suitable for the further extraction of value-added components, process parameters such as light intensity, temperature and pH are continuously monitored. Metabolic needs for nutrients are met by adding micro- and macro-nutrients to the medium to ensure autotrophic growth conditions. Cultivation is followed by downstream processing and the extraction of lipids, proteins and saccharides. Lipid extraction is conducted in a repeated-batch, semi-automatic mode using the hot extraction method according to Randall, with hexane and ethanol as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variation, between 8.1% and 13.9%. The highest percentage of extracted biomass was reached with a sample pretreated with microwave digestion using 90% hexane and 10% ethanol as solvents. The protein content of the microalgae was determined by two different methods, namely Total Kjeldahl Nitrogen (TKN), which was further converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The obtained results showed a good correlation between both methods, with the protein content being in the range of 39.8–47.1%. Neutral and acid saccharides from the microalgae were characterized by the phenol-sulfuric acid method at two wavelengths, 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, the biomass residues are used as substrate for anaerobic digestion at the laboratory scale. The methane concentration, measured on a daily basis, showed some variation between samples from the different extraction steps but was in the range of 48% to 55%. The CO2 that is formed during the fermentation process and after combustion in the combined heat and power unit can potentially be used within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.

Keywords: bioeconomy, lipids, microalgae, proteins, saccharides

Procedia PDF Downloads 246
19386 CFD Modeling of Boiling in a Microchannel Based On Phase-Field Method

Authors: Rahim Jafari, Tuba Okutucu-Özyurt

Abstract:

The hydrodynamics and heat transfer characteristics of a vaporizing elongated bubble in a rectangular microchannel have been simulated based on the Cahn-Hilliard phase-field method. In the simulations, the initially nucleated bubble starts growing as it comes into contact with superheated water. The evolving shape of the bubble is compared with the available experimental data in the literature.
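
As a minimal illustration of the phase-field machinery, the sketch below advances the 1D Cahn-Hilliard equation with an explicit scheme; the parameters are illustrative, and the thermal and hydrodynamic coupling of the boiling simulation is not included.

```python
import numpy as np

# Minimal 1D Cahn-Hilliard solver (explicit Euler, periodic boundary).
N, dx, dt, kappa, M = 128, 1.0, 0.01, 2.0, 1.0
rng = np.random.default_rng(3)
c = 0.05 * rng.standard_normal(N)            # small perturbation of the mixed state
mean0 = c.mean()

def lap(u):
    """Second-order periodic Laplacian."""
    return (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2

for _ in range(5000):
    mu = c**3 - c - kappa * lap(c)           # chemical potential of the double-well energy
    c = c + dt * M * lap(mu)                 # dc/dt = M * Laplacian(mu)

print(c.min(), c.max())                      # the field separates towards the pure phases -1 and +1
print(abs(c.mean() - mean0) < 1e-9)          # the scheme conserves the mean (mass) of c
```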

Keywords: microchannel, boiling, Cahn-Hilliard method, simulation

Procedia PDF Downloads 425
19385 Assessing Significance of Correlation with Binomial Distribution

Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar

Abstract:

Present-day high-throughput genomic technologies, NGS and microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has regularly been used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and the violation of the assumptions underlying Pearson correlation are frequent and may distort the actual correlation between genes and lead to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes as uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es); the extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we do not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to the existing ones. The method uses the binomial distribution, which has not been used for this purpose until now, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and riddled with outliers, to see whether our method can differentiate between spurious and actual correlation. While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with the existing methods.
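
The sketch below is one possible reading of the described test: expression vectors are dichotomised at their medians, the agreements Ns are counted and compared with the expectation Es under independence, and significance is assessed with a binomial test; the dichotomisation rule is an assumption, since the abstract does not fully specify how the Bernoulli outcomes are defined.

```python
import numpy as np
from scipy.stats import binomtest

def binomial_association(x, y):
    """Assess association between two genes' expression vectors by counting
    samples where both genes fall on the same side of their own median and
    comparing that count with its binomial expectation under independence."""
    a = x > np.median(x)
    b = y > np.median(y)
    n = len(x)
    ns = int(np.sum(a == b))                       # observed agreements
    p_agree = 0.5                                  # agreement probability under independence (median split)
    es = n * p_agree                               # expected agreements
    test = binomtest(ns, n, p_agree, alternative="two-sided")
    return ns, es, test.pvalue

rng = np.random.default_rng(7)
g1 = rng.normal(size=100)
g2 = 0.8 * g1 + 0.6 * rng.normal(size=100)         # correlated with g1
g3 = rng.normal(size=100)                          # independent of g1
print(binomial_association(g1, g2))                # small p-value
print(binomial_association(g1, g3))                # large p-value
```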

Keywords: binomial distribution, correlation, microarray, outliers, transcriptome

Procedia PDF Downloads 416
19384 Study of the Electromagnetic Resonances of a Cavity with an Aperture Using Numerical Method and Equivalent Circuit Method

Authors: Ming-Chu Yin, Ping-An Du

Abstract:

The shielding ability of a shielding cavity is affected greatly by its resonances, which include resonance modes and frequencies. The equivalent circuit method and the numerical transmission line matrix (TLM) method are used in this paper to analyze the effect of aperture-cavity coupling on the electromagnetic resonances of a cavity with an aperture. Both theoretical and numerical results show that the resonance modes of a shielding cavity with an aperture can be considered as the combination of the inherent resonance modes of the cavity and of the aperture, with shifted resonance frequencies, and that the reason for this shift is aperture-cavity coupling. Because the aperture dimensions are important parameters for aperture-cavity coupling, the variation of the electromagnetic resonances of a shielding cavity with its aperture dimensions is given, which will be useful for the design of shielding cavities.
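
For orientation, the sketch below lists the resonant frequencies of an ideal closed rectangular cavity, i.e. the unperturbed modes whose shift under aperture-cavity coupling the paper studies; the enclosure dimensions are illustrative.

```python
import itertools
import numpy as np

def empty_cavity_resonances(a, b, d, n_max=3):
    """Resonant frequencies (Hz) of an ideal closed rectangular cavity,
    f_mnp = (c/2) * sqrt((m/a)^2 + (n/b)^2 + (p/d)^2);
    resonant modes need at least two non-zero indices."""
    c0 = 299_792_458.0
    freqs = []
    for m, n, p in itertools.product(range(n_max + 1), repeat=3):
        if (m > 0) + (n > 0) + (p > 0) < 2:
            continue
        f = 0.5 * c0 * np.sqrt((m / a)**2 + (n / b)**2 + (p / d)**2)
        freqs.append((f, (m, n, p)))
    return sorted(freqs)

# 0.3 m x 0.4 m x 0.5 m shielding enclosure (dimensions are illustrative).
for f, mode in empty_cavity_resonances(0.3, 0.4, 0.5)[:5]:
    print(f"{f/1e6:7.1f} MHz  mode {mode}")
```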

Keywords: aperture-cavity coupling, equivalent circuit method, resonances, shielding equipment

Procedia PDF Downloads 445
19383 Cleaning Performance of High-Frequency, High-Intensity 360 kHz Frequency Operating in Thickness Mode Transducers

Authors: R. Vetrimurugan, Terry Lim, M. J. Goodson, R. Nagarajan

Abstract:

This study investigates the cleaning performance of a high-intensity 360 kHz frequency in removing nano-dimensional and sub-micron particles from various surfaces, the uniformity of the cleaning tank, and the run-to-run variation of the cleaning process. The uniformity of the cleaning tank was measured by two different methods: a ppb™ meter and the liquid particle counting (LPC) technique. In the second method, aluminium spacer components were placed at various locations in the cleaning tank (the centre and the top-left, bottom-left, top-right and bottom-right corners) and the particles removed by the 360 kHz frequency were measured. The results indicate that the energy was distributed more uniformly throughout the entire cleaning vessel, even at the corners and edges of the tank, when megasonic sweeping technology was applied. The results also show that rinsing the parts at 360 kHz in the final rinse gives lower particle counts, hence higher cleaning efficiency, compared with other frequencies. When megasonic sweeping technology is applied, each piezoelectric transducer operates at its optimum resonant frequency and generates a stronger acoustic cavitational force and a higher acoustic streaming velocity; these combined forces help to enhance particle removal and, at the same time, improve the overall cleaning performance. A multiple-extraction study was also carried out for various frequencies to measure the cleaning potential and the asymptote value.

Keywords: power distribution, megasonic sweeping, cavitation intensity, particle removal, laser particle counting, nano, submicron

Procedia PDF Downloads 418
19382 Synthesis, Characterization and Biological Properties of Half-Sandwich Complexes of Ruthenium(II), Rhodium(II) and Iridium(III)

Authors: A. Gilewska, J. Masternak, K. Kazimierczuk, L. Turlej, J. Wietrzyk, B. Barszcz

Abstract:

Platinum-based drugs are now widely used as chemotherapeutic agents. However, platinum complexes show toxic side effects: i) the development of platinum resistance; ii) the occurrence of severe side effects, such as nephro-, neuro- and ototoxicity; and iii) high toxicity towards human fibroblasts. Therefore, the development of new anticancer drugs containing different transition-metal ions, for example ruthenium, rhodium or iridium, is a valid strategy in cancer treatment. In this paper, we report the synthesis and the spectroscopic, structural and biological properties of complexes of ruthenium, rhodium and iridium containing an N,N-chelating ligand (2,2’-bisimidazole). These complexes were characterized by elemental analysis, UV-Vis and IR spectroscopy, and X-ray diffraction analysis. They exhibit the typical pseudo-tetrahedral three-legged piano-stool geometry, in which the aromatic arene ring forms the seat of the piano stool, while the bidentate 2,2’-bisimidazole ligand and the chlorido ligand form the three legs of the stool. The spectroscopic data (IR, UV-Vis) and elemental analysis correlate very well with the molecular structures. Moreover, the cytotoxic activity of the complexes was assessed on human cancer cell lines: LoVo (colorectal adenoma), MV-4-11 (myelomonocytic leukaemia) and MCF-7 (breast adenocarcinoma), as well as the normal healthy mouse fibroblast BALB/3T3 cell line. To predict a binding mode, the potential interaction of the metal complexes with calf thymus DNA (CT-DNA) and protein (BSA) was explored using UV absorption and circular dichroism (CD). It is interesting to note that the investigated complexes show no cytotoxic effect towards the normal BALB/3T3 cell line, in contrast to cisplatin, whose IC₅₀ value was determined as 2.20 µM. Importantly, the Ru(II) complex displayed the highest activity against HL-60 (IC₅₀ 4.35 µM). The biological studies (UV-Vis and circular dichroism) suggest that the arene complexes could interact with calf thymus DNA, probably via an outside binding mode, and interact with the protein (BSA).

Keywords: ruthenium(II) complex, rhodium(III) complex, iridium(III) complex, biological activity

Procedia PDF Downloads 137