Search results for: time to surgery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18583

13753 Peptide-Based Platform for Differentiation of Antigenic Variations within Influenza Virus Subtypes (Flutype)

Authors: Henry Memczak, Marc Hovestaedt, Bernhard Ay, Sandra Saenger, Thorsten Wolff, Frank F. Bier

Abstract:

The influenza viruses cause flu epidemics every year and serious pandemics at longer intervals. The only cost-effective protection against influenza is vaccination. Due to rapid mutation, new subtypes continuously appear, which requires annual reimmunization. For a correct vaccination recommendation, the circulating influenza strains have to be detected promptly and exactly and characterized according to their antigenic properties. During the flu season 2016/17, an incorrect vaccination recommendation was given because of the long interval between identification of the relevant influenza vaccine strains and the outbreak of the flu epidemic during the following winter. Due to such recurring vaccine mismatches, there is a great need to speed up the process chain from identifying the right vaccine strains to their administration. The monitoring of subtypes as part of this process chain is carried out by national reference laboratories within the WHO Global Influenza Surveillance and Response System (GISRS). To this end, thousands of viruses from patient samples (e.g., throat swabs) are isolated and analyzed each year. Currently, this analysis involves complex and time-intensive (several weeks) animal experiments to produce specific hyperimmune sera in ferrets, which are necessary for determining the antigen profiles of circulating virus strains. These tests are also difficult to standardize and reproduce, which restricts the significance of the results. To replace this test, a peptide-based assay for influenza virus subtyping from corresponding virus samples was developed. The viruses are differentiated by a set of specifically designed peptidic recognition molecules that interact differently with the different influenza virus subtypes. The differentiation of influenza subtypes is performed by pattern recognition guided by machine learning algorithms, without any animal experiments. Synthetic peptides are immobilized in multiplex format on various platforms (e.g., 96-well microtiter plate, microarray). Afterwards, the viruses are incubated and analyzed by comparing different signaling mechanisms under a variety of assay conditions. Differentiation of a range of influenza subtypes, including H1N1, H3N2, and H5N1, as well as fine differentiation of single strains within these subtypes, is possible using the peptide-based subtyping platform. The platform could thereby replace the current antigenic characterization of influenza strains using ferret hyperimmune sera.
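
The pattern-recognition step described above lends itself to a compact illustration. The sketch below is a hypothetical, minimal example (not the authors' code): it simulates subtype-specific binding patterns on a 24-peptide array and trains a generic classifier on them; the array size, noise level, and labels are all assumptions.

```python
# Hedged sketch: machine-learning subtyping from multiplexed peptide-binding
# signals. All data here are synthetic stand-ins for real assay readouts.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
subtypes = np.repeat(["H1N1", "H3N2", "H5N1"], 30)  # known reference samples

# Each subtype is assumed to produce a characteristic binding pattern across
# 24 immobilized peptides; assay noise is added on top.
n_peptides = 24
base = {s: rng.uniform(0.2, 1.0, n_peptides) for s in ("H1N1", "H3N2", "H5N1")}
signals = np.array([base[s] + 0.15 * rng.standard_normal(n_peptides)
                    for s in subtypes])

# Pattern recognition: learn the signal-pattern -> subtype mapping.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, signals, subtypes, cv=5)
print(f"cross-validated subtyping accuracy: {scores.mean():.2f}")
```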

Keywords: antigenic characterization, influenza-binding peptides, influenza subtyping, influenza surveillance

Procedia PDF Downloads 137
13752 Novel Point of Care Test for Rapid Diagnosis of COVID-19 Using Recombinant Nanobodies against SARS-CoV-2 Spike1 (S1) Protein

Authors: Manal Kamel, Sara Maher, Hanan El Baz, Faten Salah, Omar Sayyouh, Zeinab Demerdash

Abstract:

During the recent COVID-19 pandemic, public health experts have emphasized testing, tracking infected people, and tracing their contacts as an effective strategy to reduce the spread of the virus. The development of rapid and sensitive diagnostic assays to replace reverse transcription polymerase chain reaction (RT-PCR) is mandatory. Our innovative test strip relies on the application of nanoparticles conjugated to recombinant nanobodies against the SARS-CoV-2 spike protein (S1) and angiotensin-converting enzyme 2 (the receptor responsible for virus entry into host cells) for rapid detection of the SARS-CoV-2 spike protein (S1) in saliva or sputum specimens. Comparative tests with RT-PCR will be performed to estimate the significance of using COVID-19 nanobodies for the first time in the development of a lateral flow test strip. SARS-CoV-2 S1 (3 ng of recombinant protein) was detected by our developed LFIA in saliva specimens of COVID-19 patients. No cross-reaction was detected with Middle East respiratory syndrome coronavirus (MERS-CoV) or SARS-CoV antigens. Our developed system showed 96% sensitivity and 100% specificity for saliva samples, compared to 89% sensitivity and 100% specificity for nasopharyngeal swabs, providing a reliable alternative to the painful and uncomfortable nasopharyngeal swab procedure and the complex, time-consuming PCR test. An increase in testing compliance can be expected.
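
For reference, the sensitivity and specificity figures quoted above follow from the standard confusion-matrix formulas. The snippet below is a minimal illustration with hypothetical counts, not the study's data.

```python
# Minimal sketch of the diagnostic metrics; counts are hypothetical and chosen
# only to reproduce the percentages quoted in the abstract.
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Example: 48 of 50 infected saliva samples test positive, 0 of 40 controls do.
print(f"sensitivity: {sensitivity(tp=48, fn=2):.0%}")  # 96%
print(f"specificity: {specificity(tn=40, fp=0):.0%}")  # 100%
```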

Keywords: COVID 19, diagnosis, LFIA, nanobodies, ACE2

Procedia PDF Downloads 116
13751 Simulation of Lean Principles Impact in a Multi-Product Supply Chain

Authors: Matteo Rossini, Alberto Portioli Staudacher

Abstract:

Market competition is moving from the single firm to the whole supply chain because of increasing competition and a growing need for operational efficiency and customer orientation. Supply chain management allows companies to look beyond their organizational boundaries to develop and leverage the resources and capabilities of their supply chain partners. This creates competitive advantages in the marketplace, and because of this, SCM has acquired strategic importance. The lean approach is a management strategy that focuses on reducing every type of waste present in an organization, and it is becoming more and more popular among supply chain managers. However, application of the lean approach at the supply chain level is not widespread, and the impacts of lean principles in a supply chain context are not well studied. The literature contains only a few studies simulating lean performance in single-product supply chains. This research work studies the impacts of implementing lean principles along a supply chain. To achieve this, a simulation model of a three-echelon, multi-product supply chain has been built. A Kanban system (with several priority policies) and various degrees of setup time reduction are implemented in the lean-configured supply chain to apply the pull and lot-size reduction principles, respectively. To evaluate the benefits of the lean approach, the lean supply chain is compared with an EOQ-configured supply chain. The simulation results show that the Kanban system and setup-time reduction improve inventory stock levels. They also show that logistics effort depends on the degree of lean implementation. The paper concludes by describing the performance of the lean supply chain in different contexts.
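
The two replenishment logics being compared can be summarized by their textbook sizing rules. The sketch below is a simplified illustration under assumed demand and cost figures, not the paper's simulation model: the EOQ rule sizes large lots, while the lean configuration caps inventory with a kanban count.

```python
# Toy comparison of the two configurations' sizing formulas; all parameter
# values are assumptions for illustration.
import math

demand_rate = 400    # units per day (hypothetical)
order_cost = 50.0    # fixed ordering/setup cost per lot
holding_cost = 0.1   # holding cost per unit per day
lead_time = 2        # replenishment lead time, days
container = 20       # units per kanban container
safety = 0.10        # safety factor

# EOQ-configured chain: lot size from the economic order quantity formula.
eoq = math.sqrt(2 * demand_rate * order_cost / holding_cost)

# Lean-configured chain: pull control, inventory capped by the kanban count
# N = D * L * (1 + safety) / container size.
n_kanbans = math.ceil(demand_rate * lead_time * (1 + safety) / container)

print(f"EOQ lot size: {eoq:.0f} units")
print(f"kanban cards: {n_kanbans} (inventory cap = {n_kanbans * container} units)")
```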

Keywords: inventory policy, Kanban, lean supply chain, simulation study, supply chain management, planning

Procedia PDF Downloads 346
13750 Optimization of Gastro-Retentive Matrix Formulation and Its Gamma Scintigraphic Evaluation

Authors: Swapnila V. Shinde, Hemant P. Joshi, Sumit R. Dhas, Dhananjaysingh B. Rajput

Abstract:

The objective of the present study is to develop a hydrodynamically balanced system for atenolol, a β-blocker, as a single-unit floating tablet. Atenolol shows pH-dependent solubility, resulting in a bioavailability of 36%. Thus, a site-specific oral controlled-release floating drug delivery system was developed. The formulation includes the novel use of a rate-controlling polymer, locust bean gum (LBG), in combination with HPMC K4M and the gas-generating agent sodium bicarbonate. Tablets were prepared by the direct compression method and evaluated for physico-mechanical properties. A statistical design was used to optimize the effects of the independent variables, namely the amounts of HPMC K4M and LBG, on three dependent responses: cumulative drug release, floating lag time, and floating time. Graphical and mathematical analysis of the results allowed the identification and quantification of the formulation variables influencing the selected responses. To study the gastrointestinal transit of the optimized gastro-retentive formulation, in vivo gamma scintigraphy was carried out in six healthy rabbits after radiolabeling the formulation with 99mTc. The transit profiles demonstrated that the dosage form was retained in the stomach for more than 5 h. The study signifies the potential of the developed system for stomach-targeted delivery of atenolol with improved bioavailability.
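
As an aside, the factorial-design optimization step boils down to fitting a regression model of each response against the coded factor levels. The sketch below is a hypothetical two-factor example with invented levels and responses, not the study's data.

```python
# Hedged sketch: fit y = b0 + b1*x1 + b2*x2 + b12*x1*x2 to a 2^2 factorial
# design with a center point. Factor levels (coded -1/0/+1 for HPMC K4M and
# LBG amounts) and the response values are illustrative assumptions.
import numpy as np

x1 = np.array([-1., -1., 1., 1., 0.])        # coded HPMC K4M level
x2 = np.array([-1., 1., -1., 1., 0.])        # coded LBG level
y = np.array([55., 62., 70., 83., 68.])      # e.g. % cumulative drug release

# Design matrix: intercept, main effects, interaction.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
(b0, b1, b2, b12), *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"y = {b0:.1f} + {b1:.1f}*x1 + {b2:.1f}*x2 + {b12:.1f}*x1*x2")
```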

Keywords: floating tablet, factorial design, gamma scintigraphy, antihypertensive model drug, HPMC, locust bean gum

Procedia PDF Downloads 268
13749 Mental Health Challenges, Internalizing and Externalizing Behavior Problems, and Academic Challenges among Adolescents from Broken Families

Authors: Fadzai Munyuki

Abstract:

Parental divorce is one of youth's most stressful life events and is associated with long-lasting emotional and behavioral problems. Over the last few decades, research has consistently found strong associations between divorce and adverse health effects in adolescents. Parental divorce has been hypothesized to lead to psychosocial development problems, mental health challenges, internalizing and externalizing behavior problems, and low academic performance among adolescents. This is supported by positive youth development theory, which states that the family setup has a major role to play in adolescent development and well-being. The focus of this research is therefore to test this hypothesized process model among adolescents in five provinces in Zimbabwe. A cross-sectional study will be conducted to test this hypothesis, and 1840 adolescents (n = 1840) aged 14 to 17 will be recruited. A stress questionnaire scale, the Child Behavior Checklist, and an academic concept scale will be used. Data analysis will be done using structural equation modeling. Prior research in this area has many limitations, including the lack of 'real-time' studies, few cross-sectional studies, the lack of thorough and validated population measures, and a tendency to focus on a single variable in relation to parental divorce. This study therefore seeks to bridge the gap between past research and the current literature by using a validated population measure, a real-time design, and a combination of three latent variables.

Keywords: mental health, internalizing and externalizing behavior, divorce, academic achievements

Procedia PDF Downloads 60
13748 Effect of Burdock Root Extract Concentration on Physicochemical Properties of Coated Jasmine Rice Using a Top-Spray Fluidized Bed Coating Technique

Authors: Donludee Jaisut, Norihisa Kato, Thanutchaporn Kumrungsee, Kiyoshi Kawai, Somkiat Prachayawarakorn, Patchalee Tungtrakul

Abstract:

Jasmine rice is a staple food of the Thai people. However, the glycemic index of jasmine rice is high, carrying a risk of type II diabetes with regular consumption. Burdock root is a good source of non-starch polysaccharides such as inulin. Inulin acts as a prebiotic and helps reduce blood sugar levels. The purpose of this research was to reduce the digestion rate of jasmine rice by coating burdock root extract onto the rice surface using a top-spray fluidized bed coating technique. Coating experiments were performed by spraying burdock root solution onto jasmine rice kernels (Khao Dawk Mali-105; KDML), which had an initial moisture content of 11.6% wet basis, suspended in the fluidized bed. The experimental conditions were: solution spray rate of 31.7 mL/min, atomization pressure of 1.5 bar, spray time of 10 min, drying time after spraying of 30 s, superficial air velocity of 3.2 m/s, and drying temperature of 60°C. The coated rice quality was evaluated in terms of moisture content, texture, whiteness, and digestion rate. The results showed that the initial and final moisture contents of the samples were the same at extract concentrations of 8% (v/v) and 10% (v/v). The texture did not change significantly from that of the uncoated sample. The whiteness values varied with the concentration of burdock root extract. Coated samples were digested more slowly.

Keywords: burdock root, digestion, drying, rice

Procedia PDF Downloads 281
13747 An Exploratory Study of Human Character Analysis Based on Smart Watch Data: Distinguishing the Drinking State from the Normal State

Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing

Abstract:

Smart watches, as handy devices with rich functionality, have become among the most popular wearable devices all over the world. Among their various functions, the most basic is health monitoring. The monitoring data can serve as effective evidence or clues in the investigation of criminal cases. For instance, step counting data can help determine whether the watch wearer was still or moving during a given time period. There is, however, still little research on the analysis of human character based on these data. The purpose of this research is to analyze health monitoring data to distinguish the drinking state from the normal state. The analysis results may play a role in cases involving drinking, such as drunk driving. The experiment mainly focused on finding which figures in smart watch health monitoring data change with drinking and on quantifying the extent of the change. The chosen subjects were mostly in their 20s, each of whom wore the same smart watch for a week. Each subject drank several times during the week and noted down the start and end times of each drinking session. The researchers then extracted and analyzed the health monitoring data from the watch. According to the descriptive statistical analysis, heart rate changes when drinking: the average heart rate is about 10% higher than normal, and the coefficient of variation falls to less than about 30% of its value in the normal state. Though more research needs to be carried out, this experiment and analysis suggest possible applications for data from smart watches.
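
The descriptive comparison reported above (mean shift and coefficient of variation) is easy to reproduce on any extracted heart-rate series. The snippet below is a sketch on hypothetical samples, not the study's data.

```python
# Sketch: compare mean heart rate and coefficient of variation between the
# normal and drinking states; the bpm samples are invented for illustration.
import numpy as np

normal = np.array([68, 72, 70, 74, 69, 71, 73, 70], dtype=float)
drinking = np.array([78, 80, 77, 82, 79, 81, 80, 78], dtype=float)

def cv(x: np.ndarray) -> float:
    """Coefficient of variation: sample std / mean."""
    return x.std(ddof=1) / x.mean()

rise = drinking.mean() / normal.mean() - 1
print(f"mean HR rise while drinking: {rise:.1%}")   # the study reports ~10%
print(f"CV normal: {cv(normal):.3f}, CV drinking: {cv(drinking):.3f}")
```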

Keywords: character analysis, descriptive statistics analysis, drink state, heart rate, smart watch

Procedia PDF Downloads 155
13746 Robotic Arm-Automated Spray Painting with One-Shot Object Detection and Region-Based Path Optimization

Authors: Iqraq Kamal, Akmal Razif, Sivadas Chandra Sekaran, Ahmad Syazwan Hisaburi

Abstract:

Painting plays a crucial role in the aerospace manufacturing industry, serving both protective and cosmetic purposes for components. However, the traditional manual painting method is time-consuming and labor-intensive, posing challenges for the sector in achieving higher efficiency. Additionally, current automated robot path planning has been a bottleneck for spray painting processes, as typical manual teaching methods are time-consuming, error-prone, and skill-dependent. Therefore, it is essential to develop automated tool path planning methods to replace manual ones, reducing costs and improving product quality. Focusing on flat panel painting in aerospace manufacturing, this study aims to address the unreliable part identification caused by the high-mix, low-volume nature of the industry. The proposed solution involves using a spray gun and a UR10 robotic arm with a vision system that utilizes one-shot object detection (OS2D) to identify parts accurately. Additionally, the research optimizes path planning by concentrating on the region of interest, specifically the identified part, rather than uniformly covering the entire painting tray.
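
To make the region-of-interest idea concrete, a minimal planner can generate a back-and-forth (boustrophedon) spray path only over the detected part's bounding box. The sketch below is an assumption-laden illustration; the box coordinates, pass spacing, and the planner itself are hypothetical, not the authors' implementation.

```python
# Illustrative sketch: boustrophedon spray waypoints over the detected part's
# bounding box instead of the whole tray. All values are hypothetical.
from typing import List, Tuple

def roi_spray_path(x0: float, y0: float, x1: float, y1: float,
                   step: float) -> List[Tuple[float, float]]:
    """Back-and-forth waypoints covering the region [x0, x1] x [y0, y1]."""
    waypoints = []
    y, left_to_right = y0, True
    while y <= y1:
        xs = (x0, x1) if left_to_right else (x1, x0)
        waypoints.append((xs[0], y))
        waypoints.append((xs[1], y))
        y += step                     # nozzle overlap sets the pass spacing
        left_to_right = not left_to_right
    return waypoints

# Bounding box returned by the one-shot detector (hypothetical values, metres).
path = roi_spray_path(0.10, 0.05, 0.60, 0.45, step=0.05)
print(f"{len(path)} waypoints, first: {path[0]}, last: {path[-1]}")
```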

Keywords: aerospace manufacturing, one-shot object detection, automated spray painting, vision-based path optimization, deep learning, automation, robotic arm

Procedia PDF Downloads 63
13745 The Methodology of Hand-Gesture Based Form Design in Digital Modeling

Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim

Abstract:

As digital technology develops, studies on TUIs (Tangible User Interfaces), which link the physical environment to the virtual environment through the computer by utilizing the human senses, are being actively conducted. In addition, there has been tremendous progress in computational design through computer-aided design techniques, which enable optimized decision-making via machine learning and the parallel comparison of alternatives. A complex design that responds to user requirements or performance can emerge from the intuition of the designer, but it is difficult to actualize such an emergent design by the designer's ability alone. Ancillary tools, such as Gaudí's sandbag models, can be instruments to reinforce and evolve designers' emergent ideas. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree to which they are reflected depends on the designer's proficiency with the design tools. This study implements an environment in which form can be shaped with the designer's own fingers in the initial design phase of complex building design. A Leap Motion sensor is used to recognize the hand motions of the designer, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). In addition, the implemented design can be linked in real time with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™. As a result, it is possible to design intuitively using a TUI, and the system can serve as a tool for assisting designer intuition.

Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality

Procedia PDF Downloads 356
13744 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data portals belonging to different governments. This requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use. In this way, the number of applications based on Open Data is increasing. However, each government has its own procedures for publishing its data, which results in a variety of data set formats, because there are no international standards specifying formats for Open Data sets. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data sets on environmental indicators in real time. In the same way, other governments (such as Andalucia or Bilbao) have published Open Data sets relative to the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can start in a computationally better way. So the tool presented in this work has two goals: 1. the integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries for building a graphic interface, such as Shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set about environmental data in Spain to any developer so that they can build their own applications.
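
The per-source adapter idea at the core of the integration process can be sketched briefly: each government's format gets a small normalizer that maps its fields onto one common schema before the sets are concatenated. The field names and example records below are hypothetical, not the actual portal schemas.

```python
# Minimal sketch of per-government adapters normalizing heterogeneous Open
# Data records into one common schema. All field names are assumptions.
import pandas as pd

COMMON_COLUMNS = ["timestamp", "city", "indicator", "value", "unit"]

def normalize_madrid(raw: pd.DataFrame) -> pd.DataFrame:
    out = raw.rename(columns={"fecha": "timestamp", "magnitud": "indicator",
                              "valor": "value", "unidad": "unit"})
    out["city"] = "Madrid"
    return out[COMMON_COLUMNS]

def normalize_bilbao(raw: pd.DataFrame) -> pd.DataFrame:
    out = raw.rename(columns={"date": "timestamp", "metric": "indicator",
                              "reading": "value", "units": "unit"})
    out["city"] = "Bilbao"
    return out[COMMON_COLUMNS]

madrid = pd.DataFrame([{"fecha": "2016-05-01T10:00", "magnitud": "NO2",
                        "valor": 38.0, "unidad": "ug/m3"}])
bilbao = pd.DataFrame([{"date": "2016-05-01T10:00", "metric": "NO2",
                        "reading": 41.5, "units": "ug/m3"}])

integrated = pd.concat([normalize_madrid(madrid), normalize_bilbao(bilbao)],
                       ignore_index=True)
print(integrated)
```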

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 300
13743 System Identification of Timber Masonry Walls Using Shaking Table Test

Authors: Timir Baran Roy, Luis Guerreiro, Ashutosh Bagchi

Abstract:

Dynamic studies are important for the design, repair, and rehabilitation of structures. They have played an important role in characterizing the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and time domain decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is to study the dynamic characteristics of typical timber masonry walls commonly used in Portugal. For that purpose, multi-storey structural prototypes of such walls were tested on a seismic shaking table at the National Laboratory for Civil Engineering (LNEC), Portugal. The output response, collected from the shaking table experiment on the prototype using accelerometers, was then processed. In the present work, signal processing of the output response, based on the input response, has been done in two ways: FDD and Stochastic Subspace Identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD are formulated, and parametric functions for the SSI are computed. Finally, estimated values from both methods are compared to assess the accuracy of the two techniques.
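
For readers unfamiliar with FDD, its core step can be sketched compactly: estimate the cross-spectral density matrix of the accelerometer channels and take its singular value decomposition at each frequency line; peaks of the first singular value indicate natural frequencies. The example below uses synthetic two-channel signals, not the LNEC test data.

```python
# Hedged FDD sketch on synthetic signals dominated by a 4.5 Hz mode.
import numpy as np
from scipy.signal import csd

fs, n = 256.0, 8192
t = np.arange(n) / fs
rng = np.random.default_rng(1)
mode = np.sin(2 * np.pi * 4.5 * t)
acc = np.vstack([mode + 0.5 * rng.standard_normal(n),
                 0.8 * mode + 0.5 * rng.standard_normal(n)])

# Cross-spectral density matrix G(f) over all channel pairs.
n_ch = acc.shape[0]
freqs, _ = csd(acc[0], acc[0], fs=fs, nperseg=1024)
G = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)
for i in range(n_ch):
    for j in range(n_ch):
        _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=1024)

# First singular value per frequency line; its peaks mark natural frequencies.
s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0]
               for k in range(len(freqs))])
print(f"estimated natural frequency: {freqs[np.argmax(s1)]:.2f} Hz")
```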

Keywords: frequency domain decomposition (FDD), modal parameters, signal processing, stochastic subspace identification (SSI), time domain decomposition

Procedia PDF Downloads 253
13742 Stability Optimization of NaBH₄ via pH and H₂O:NaBH₄ Ratios for Large Scale Hydrogen Production

Authors: Parth Mehta, Vedasri Bai Khavala, Prabhu Rajagopal, Tiju Thomas

Abstract:

There is an increasing need for alternative clean fuels, and hydrogen (H₂) has long been considered a promising solution with a high calorific value (142 MJ/kg). However, the storage of H₂ and the expensive processes for its generation have hindered its usage. Sodium borohydride (NaBH₄) can potentially be used as an economically viable means of H₂ storage. Thus far, there have been attempts to optimize the half-life of NaBH₄ in aqueous media by stabilizing it with sodium hydroxide (NaOH) at various pH values. Other reports have shown that the H₂ yield and reaction kinetics remained constant for all H₂O:NaBH₄ ratios above 30:1, without any acidic catalysts. Here we highlight the importance of pH and the H₂O:NaBH₄ ratio (80:1, 40:1, 20:1 and 10:1 by weight) for NaBH₄ stabilization (half-life reaction time at room temperature) and for minimizing corrosion of H₂ reactor components. It is interesting to observe that at any particular pH ≥ 10 (e.g., pH = 10, 11 and 12), the H₂O:NaBH₄ ratio does not have the expected linear relationship with stability. On the contrary, high stability was observed at a 10:1 H₂O:NaBH₄ ratio across all these pH values. When the H₂O:NaBH₄ ratio is increased from 10:1 to 20:1 and beyond (up to 80:1), constant stability (% degradation) is observed with respect to time. For practical usage (consumption within 6 hours of making the NaBH₄ solution), 15% degradation at pH 11 and an H₂O:NaBH₄ ratio of 10:1 is recommended. Increasing this ratio demands a higher NaOH concentration at the same pH, thus requiring a higher concentration or volume of acid (e.g., HCl) for H₂ generation. The reactions are done with tap water to render the results useful from an industrial standpoint. The observed stability regimes are rationalized based on the complexes associated with NaBH₄ when solvated in water, which depend sensitively on both pH and the H₂O:NaBH₄ ratio.
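
The storage appeal of NaBH₄ rests on the ideal hydrolysis stoichiometry NaBH₄ + 2H₂O → NaBO₂ + 4H₂. A back-of-envelope sketch of the resulting yield, using the calorific value quoted above:

```python
# Ideal-case arithmetic only; real yields depend on the stabilization and
# degradation effects discussed in the abstract.
M_NABH4 = 37.83  # g/mol, sodium borohydride
M_H2 = 2.016     # g/mol, hydrogen

h2_per_kg = 4 * M_H2 / M_NABH4        # kg H2 released per kg NaBH4
energy_per_kg = h2_per_kg * 142.0     # MJ, using 142 MJ/kg for H2

print(f"H2 yield: {h2_per_kg:.3f} kg H2 per kg NaBH4")   # ~0.213
print(f"deliverable energy: {energy_per_kg:.0f} MJ per kg NaBH4")
```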

Keywords: hydrogen, sodium borohydride, stability optimization, H₂O:NaBH₄ ratio

Procedia PDF Downloads 103
13741 Team Teaching, Students' Perception, Challenges, and Remedies for Effective Implementation: A Case Study of the Department of Biology, Alvan Ikoku Federal College of Education, Owerri, Imo State, Nigeria

Authors: Daniel Ihemtuge Akim, Micheal O. Ikeanumba

Abstract:

This research focused on team teaching: students' perception of it, the challenges it faces, and remedies for its effective implementation, using the Department of Biology, Alvan Ikoku Federal College of Education, Owerri, Imo State, Nigeria as a case study. It seeks to address students' misconceptions about the use of team teaching as a methodology for learning. Five purposes and five research questions guided the study, and a descriptive survey design was used. The students of the Department of Biology enrolled in both the Bachelor's degree and the National Certificate in Education at Alvan Ikoku Federal College of Education, Owerri, formed the population. A simple random sampling technique was used to select the sampled students, and 20% of all lecturers were selected, within a given sample size of three hundred and forty (340). The instrument used for data collection was a structured 4-point Likert-scale questionnaire, and the analysis was carried out using the mean method. The results revealed that poor time management by lecturers and a lack of lecture venues and manpower are some of the challenges hindering the effective implementation of team teaching. It was also observed that students perform better academically when the team teaching approach is used than with a single-teacher approach. Finally, it was recommended that teachers involved in team teaching should coordinate their teaching strategies and work within the time frame to achieve the stated objectives.

Keywords: challenges, implementation, perception, team teaching

Procedia PDF Downloads 368
13740 An Analysis of Gamification in the Post-Secondary Classroom

Authors: F. Saccucci

Abstract:

Gamification has now started to take root in the post-secondary classroom. Educators have learned much about gamification to date, but there is still a great deal to learn. One definition of gamification is the ability to engage post-secondary students with games that are fun and correlate to the classroom curriculum. There is no shortage of literature illustrating the advantages of gamification in the classroom. This study is an extension of similar thought, as well as of a previous study in which in-class testing, using a paired t-test, showed that gamification significantly improved students' understanding of the subject material. Gamification in the classroom can range from high-end computer-simulated software to paper-based games, both of which have advantages and disadvantages. This analysis used a paper-based game to highlight certain qualitative advantages of gamification. The paper-based game in this analysis was inexpensive, required little preparation time from the faculty member, and consumed approximately 20 minutes of class time. Data for the study were collected through in-class student feedback surveys and a narrative from the faculty member moderating the game. Students were randomly assigned to groups of four. The qualitative advantages identified in this analysis included: 1. Students had a chance to meet, connect with, and get to know other students. 2. Students enjoyed the gamification process, given its sense of fun and competition. 3. The post-assessment that followed the simulation game was not part of their grade calculation; it was therefore an opportunity to participate in a low-risk activity whereby students could self-assess their understanding of the subject material. 4. In the view of the students, content knowledge did increase after the gamification process. These qualitative advantages support the argument that gamification should be attempted in today's post-secondary classroom. The analysis also highlighted that eighty (80) percent of respondents believed twenty minutes devoted to the gamification process was appropriate; however, twenty (20) percent of respondents believed that rather than scheduling a gamification process and its post-quiz in the last week, a review for the final exam might have been more useful. A follow-up study hopes to determine whether the scheduling of the gamification had any correlation with the percentage of students not wanting to engage in the process. The follow-up study also hopes to determine at what point additional time invested in classroom gamification produces no material incremental benefit to students, and whether any correlation exists between respondents preferring not to have it at the end of the semester and students not believing the gamification process increased their curricular knowledge.
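
The paired t-test mentioned above compares each student's score before and after the intervention. The snippet below is a minimal sketch with invented scores, not the study's data.

```python
# Paired t-test sketch on hypothetical pre/post quiz scores (same students).
from scipy import stats

pre = [62, 70, 55, 68, 74, 60, 66, 71, 58, 65]
post = [70, 75, 63, 72, 80, 68, 70, 78, 66, 71]

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("significant improvement in understanding after gamification")
```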

Keywords: gamification, inexpensive, non-quantitative advantages, post-secondary

Procedia PDF Downloads 197
13739 Detection of Acrylamide Using Liquid Chromatography-Tandem Mass Spectrometry and Quantitative Risk Assessment in Selected Food from Saudi Market

Authors: Sarah A. Alotaibi, Mohammed A. Almutairi, Abdullah A. Alsayari, Adibah M. Almutairi, Somaiah K. Almubayedh

Abstract:

Concerns over the presence of acrylamide in food date back to 2002, when Swedish scientists reported that, in carbohydrate-rich foods, substantial amounts of acrylamide are formed when they are cooked at high temperatures. Similar findings were reported by other researchers, which consequently prompted major international efforts to investigate dietary exposure and the subsequent health complications in order to properly manage the issue. In this work, we aim to determine the acrylamide levels in different foods (coffee, potato chips, biscuits, and baby food) commonly consumed by the Saudi population. In a total of forty-three samples, acrylamide was detected in twenty-three samples at levels of 12.3 to 2850 µg/kg. Across the food groups, the highest concentration of acrylamide was found in coffee samples (<12.3-2850 µg/kg), followed by potato chips (655-1310 µg/kg), then biscuits (23.5-449 µg/kg), whereas the lowest acrylamide level was observed in baby food (<14.75-126 µg/kg). Most coffee, biscuit, and potato chip products contained high acrylamide levels and are also among the most commonly consumed products. Saudi adults had mean acrylamide exposures from coffee, potato, biscuit, and cereal of 0.07439, 0.04794, 0.01125, and 0.003371 µg/kg-b.w/day, respectively. On the other hand, the exposures of Saudi infants and children to the same types of food were 0.1701, 0.1096, 0.02572, and 0.00771 µg/kg-b.w/day, respectively. Most groups have exposure percentiles that exceed the tolerable daily intake (TDI) value for cancer (2.6 µg/kg-b.w/day). Overall, the MOE results show that the Saudi population is at high risk of acrylamide-related disease from all food types, and there is a potential cancer risk in all age groups (all values <10,000). Furthermore, it was found that, for non-cancer risks, the acrylamide in all tested foods was within the safe limit (>125), except for potato chips, for which there is a risk of disease in the population. With potato and coffee as raw materials, additional studies were conducted to assess different factors affecting acrylamide formation in fried potato and roasted coffee, including temperature, cooking time, and additives. By systematically varying processing temperatures and times, a mitigation of acrylamide content was achieved by lowering the temperature and decreasing the cooking time. Furthermore, it was shown that the combined addition of chitosan and NaCl had a large impact on acrylamide formation.
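
The exposure and margin-of-exposure (MOE) arithmetic used in such assessments is straightforward. The sketch below uses hypothetical intake figures; the BMDL10 of 0.17 mg/kg-b.w/day for neoplastic effects is the commonly cited EFSA benchmark and is taken here as an assumption.

```python
# Hedged sketch of dietary exposure and MOE calculations; intake values are
# invented for illustration, and the BMDL10 is an assumed benchmark.
BMDL10 = 170.0  # ug/kg-b.w/day, assumed EFSA benchmark for neoplastic effects

def exposure(conc_ug_per_kg: float, intake_kg_per_day: float,
             body_weight_kg: float) -> float:
    """Dietary exposure in ug/kg-b.w/day."""
    return conc_ug_per_kg * intake_kg_per_day / body_weight_kg

# Example: coffee at 1500 ug/kg, 3.5 g consumed per day, 70 kg adult.
exp_coffee = exposure(1500.0, 0.0035, 70.0)
moe = BMDL10 / exp_coffee

print(f"exposure: {exp_coffee:.3f} ug/kg-b.w/day, MOE: {moe:.0f}")
print("health concern" if moe < 10_000 else "low concern")  # MOE rule of thumb
```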

Keywords: risk assessment, dietary exposure, MOA, acrylamide, hazard

Procedia PDF Downloads 39
13738 A Surgical Correction and Innovative Splint for Swan Neck Deformity in Hypermobility Syndrome

Authors: Deepak Ganjiwale, Karthik Vishwanathan

Abstract:

Objective: Splinting is an important domain of the occupational therapy profession. Making a splint for a patient depends upon the needs or requirements arising from the problems and deformities. Swan neck deformity of the finger is not very common; it may occur after various diseases. Conservative treatment of swan neck deformity is available using different static splints only. There are very few reports of surgical correction of swan-neck deformity in benign hypermobility syndrome. Method: This case report describes the result of surgical intervention and hand splinting in a twenty-year-old woman with a past history of cardiovascular stroke with no residual neurological deficit. She presented with a correctable swan neck deformity and had failed to improve with static ring splints intended to correct the deformity. She was noted to have hyperlaxity (Ehlers-Danlos type), with a modified Beighton score of 5/9. She underwent volar plate plication of the proximal interphalangeal joint of the left ring finger along with hemitenodesis of the ulnar slip of the flexor digitorum superficialis (FDS) tendon, whereby the ulnar slip of the FDS was passed through a small surgically created rent in the A2 pulley and sutured back to itself. Result: Postoperatively, the patient was referred to occupational therapy for splinting, with the instruction that the splint would act at times as a static splint and at times as a dynamic one, for positioning and correction of the finger. Conclusion: After occupational therapy intervention and splinting, the patient had full correction of the swan-neck deformity with near-full flexion of the operated finger and is able to work independently.

Keywords: swan neck, finger, deformity, splint, hypermobility

Procedia PDF Downloads 242
13737 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection

Authors: Rubin Dan, Xingcai Wang, Ziyang Chen

Abstract:

A chaotic time series noise reduction method based on the fusion of the local projection method, wavelet transform, and particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition caused by noise when identifying chaotic time series. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; then it uses the wavelet transform to remove the D1-D3 high-frequency components to maximize the retention of signal information, while least-squares optimization is performed by the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated and verified, and the phase space, SNR value, RMSE value, and K value of the 0-1 test method before and after noise reduction are compared and analyzed for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, which shows that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to a classical system to evaluate the noise reduction effect of the four methods and the identification of the original system, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
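
The first step of the pipeline, phase space reconstruction, is a time-delay embedding of the scalar series. The sketch below illustrates it on a synthetic noisy signal; the delay and embedding dimension are placeholder values (in practice they are chosen by methods such as mutual information and false nearest neighbours).

```python
# Time-delay embedding sketch; signal, delay, and dimension are illustrative.
import numpy as np

def embed(x: np.ndarray, dim: int, tau: int) -> np.ndarray:
    """Return the delay-embedding matrix with rows [x(i), x(i+tau), ...]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau: k * tau + n] for k in range(dim)])

t = np.linspace(0, 40, 4000)
noisy = np.sin(t) + 0.3 * np.random.default_rng(2).standard_normal(t.size)

points = embed(noisy, dim=3, tau=25)
print(points.shape)   # (3950, 3): reconstructed trajectory in phase space
```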

Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising

Procedia PDF Downloads 181
13736 Environmental Potential of Biochar from Wood Biomass Thermochemical Conversion

Authors: Cora Bulmău

Abstract:

Soil polluted by hydrocarbon spills is a major global concern today. In response to this issue, our experimental study examines an environmentally friendly option, the use of biochar, as against a classical procedure, the incineration of contaminated soil. Biochar is the solid product obtained through the pyrolysis of biomass, additionally used as an amendment intended to improve the quality of the soil. The positive effect of biochar addition to soil lies in its capacity to adsorb and contain petroleum products within its pores. Given the capacity of biochar to interact with organic contaminants, the purpose of the present study was to experimentally establish the effects of adding wood biomass-derived biochar to a soil contaminated with oil. The contaminated soil was amended with biochar (10%) produced by pyrolysis under different operational conditions of the thermochemical process. After 25 days, the concentration of petroleum hydrocarbons in the soil treated with biochar was measured. Soxhlet extraction was adopted as the analytical method to estimate the concentrations of total petroleum hydrocarbons (TPH) in the soil samples. This technique was applied to the contaminated soil as well as to the soils remediated by incineration or by adding biochar. Treating the soil with biochar obtained from pyrolysis of birch wood led to a considerable decrease in the concentrations of petroleum products. The incineration treatments conducted experimentally to clean up the same soil involved the following parameters: temperatures of about 600°C, 800°C, and 1000°C, and treatment times of 30 and 60 minutes. The experimental results revealed that the biochar method achieved efficiency values up to those of the incineration processes applied for the shortest time.

Keywords: biochar, biomass, remediation, soil, TPH

Procedia PDF Downloads 217
13735 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script makes use of the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on the GitHub repository system, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
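
The back-end query pattern described above can be sketched in a few lines of PyMongo. The database, collection, and field names below are assumptions, not the project's actual schema, and a local MongoDB instance is assumed to be running.

```python
# Hedged PyMongo sketch: pull one user's readings for a daily snapshot.
from datetime import datetime, timedelta
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["health_dashboard"]          # hypothetical database name

since = datetime.utcnow() - timedelta(days=1)
cursor = db["metrics"].find(
    {"user_id": "patient-42", "timestamp": {"$gte": since}}
).sort("timestamp", 1)

for doc in cursor:
    # Each document holds one reading, e.g. {"metric": "heart_rate", "value": 72}
    print(doc["metric"], doc["value"], "target:", doc.get("goal", "n/a"))
```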

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 374
13734 Development and Validation of a High-Performance Liquid Chromatography Method for the Determination and Pharmacokinetic Study of Linagliptin in Rat Plasma

Authors: Hoda Mahgoub, Abeer Hanafy

Abstract:

Linagliptin (LNG) belongs to the dipeptidyl peptidase-4 (DPP-4) inhibitor class. DPP-4 inhibitors represent a new therapeutic approach for the treatment of type 2 diabetes in adults. The aim of this work was to develop and validate an accurate and reproducible HPLC method for the determination of LNG with high sensitivity in rat plasma. The method involved separation of both LNG and pindolol (internal standard) at ambient temperature on a Zorbax Eclipse XDB C18 column with a mobile phase composed of 75% methanol : 25% formic acid 0.1% (pH 4.1) at a flow rate of 1.0 mL.min-1. UV detection was performed at 254 nm. The method was validated in compliance with ICH guidelines and found to be linear in the range of 5-1000 ng.mL-1. The limit of quantification (LOQ) was found to be 5 ng.mL-1 based on 100 µL of plasma. The variations for intra- and inter-assay precision were less than 10%, and the accuracy values ranged between 93.3% and 102.5%. The extraction recovery (R%) was more than 83%. The method involved a single extraction step from a very small plasma volume (100 µL). The assay was successfully applied to an in-vivo pharmacokinetic study of LNG in rats administered a single oral dose of 10 mg.kg-1 LNG. The maximum concentration (Cmax) was found to be 927.5 ± 23.9 ng.mL-1. The area under the plasma concentration-time curve (AUC0-72) was 18285.02 ± 605.76 h.ng.mL-1. In conclusion, the good accuracy and low LOQ of the bioanalytical HPLC method make it suitable for monitoring the full pharmacokinetic profile of LNG in rats. The main advantages of the method are its sensitivity, small sample volume, single-step extraction procedure, and short time of analysis.
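
The reported AUC0-72 is a non-compartmental quantity obtained with the linear trapezoidal rule. The sketch below shows the computation on hypothetical concentration-time points, not the study's data.

```python
# Trapezoidal AUC sketch; the points below are invented for illustration.
import numpy as np

t = np.array([0, 0.5, 1, 2, 4, 8, 24, 48, 72], dtype=float)           # h
c = np.array([0, 410, 780, 927, 850, 600, 250, 80, 20], dtype=float)  # ng/mL

auc = (((c[1:] + c[:-1]) / 2) * np.diff(t)).sum()  # linear trapezoidal rule
cmax, tmax = c.max(), t[np.argmax(c)]
print(f"Cmax = {cmax:.0f} ng/mL at t = {tmax} h, AUC(0-72) = {auc:.0f} h*ng/mL")
```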

Keywords: HPLC, linagliptin, pharmacokinetic study, rat plasma

Procedia PDF Downloads 232
13733 Spontaneous Rupture of Splenic Artery Pseudoaneurysm; A Rare Presentation of Acute Abdominal Pain in the Emergency Department: Case Report

Authors: Zainab Elazab, Azhar Aziz

Abstract:

Background: Spontaneous splenic artery pseudoaneurysm rupture is a rare condition that is potentially life threatening if not detected and managed early. We report a case of abdominal pain with intraperitoneal free fluid, which turned out to be spontaneous rupture of a splenic artery pseudoaneurysm and was treated with arterial embolization. Case presentation: A 28-year-old, previously healthy male presented to the ED with a history of sudden-onset upper abdominal pain and a fainting attack. The patient denied any history of trauma or prior similar attacks. On examination, the patient had tachycardia and a low-normal BP (HR 110, BP 106/66), but his other vital signs were normal (Temp. 37.2, RR 18 and SpO2 100%). His abdomen was initially soft, with mild tenderness in the upper region. Blood tests showed leukocytosis of 12.3 × 10⁹/L, Hb of 12.6 g/dl, and lactic acid of 5.9 mmol/L. Ultrasound showed a trace of free fluid in the perihepatic and perisplenic areas, and a splenic hypoechoic lesion. The patient remained stable; however, his abdomen became increasingly tender, with guarding. We made a provisional diagnosis of a perforated viscus, and the patient was started on IV fluids and IV antibiotics. An erect abdominal x-ray did not show any free air under the diaphragm, so a CT of the abdomen was requested. Meanwhile, bedside ultrasound was repeated, which showed an increased amount of free fluid, suggesting intra-abdominal bleeding as the most probable etiology. The CT of the abdomen revealed a splenic injury with multiple lacerations and a focal intrasplenic enhancing area on the venous phase scan (suggesting a pseudoaneurysm with associated splenic intraparenchymal, subcapsular, and perisplenic hematomas). Free fluid in the subhepatic and intraperitoneal regions along the small bowel was also detected. An angiogram was done, which confirmed the diagnosis of a pseudoaneurysm of an intrasplenic arterial branch, and angio-embolization was performed to control the bleeding. The patient was later discharged in good condition with surgical follow-up. Conclusion: Splenic artery pseudoaneurysm rupture is a rare cause of abdominal pain that should be considered in any case of abdominal pain with intraperitoneal bleeding. Early management is crucial, as the condition carries a high mortality. Bedside ultrasound is a useful tool for early diagnosis of such cases.

Keywords: abdominal pain, pseudo aneurysm, rupture, splenic artery

Procedia PDF Downloads 301
13732 A Programming Assessment Software Artefact Enhanced with the Help of Learners

Authors: Romeo A. Botes, Imelda Smit

Abstract:

The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback; at the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing. At the same time, feedback lacks promptness, consistency, comprehensiveness, and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact that is used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed using three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles to implementing this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.

Keywords: programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method

Procedia PDF Downloads 284
13731 Uncovering Hidden Bugs: An Exploratory Approach

Authors: Sagar Jitendra Mahendrakar

Abstract:

Exploratory testing is a dynamic and adaptable method of software quality assurance that is frequently praised for its ability to find hidden flaws and improve the overall quality of the product. Instead of relying on preset test cases, as scripted testing methodologies do, exploratory testing allows testers to explore the software application dynamically, relying primarily on tester intuition, creativity, and adaptability. There are several tools and techniques that can aid testers in the exploratory testing process, which we will be discussing in this talk. Tests of this kind are able to find bugs that are harder to find during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development methods. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting possible problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by giving testers the ability to make decisions in real time based on their observations. This abstract also clarifies the unique features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Through impromptu exploration, testers can find intricate bugs, usability problems, and edge cases that might otherwise go undetected. Exploratory testing's flexible and iterative structure fits in well with agile and DevOps processes, allowing for a quicker time to market without sacrificing the quality of the final product.

Keywords: exploratory, testing, automation, quality

Procedia PDF Downloads 24
13730 Influence of Low and Extreme Heat Fluxes on Thermal Degradation of Carbon Fibre-Reinforced Polymers

Authors: Johannes Bibinger, Sebastian Eibl, Hans-Joachim Gudladt

Abstract:

This study considers the influence of different irradiation scenarios on the thermal degradation of carbon fiber-reinforced polymers (CFRP). Real threats are simulated, such as fires with long-lasting low heat fluxes and nuclear heat flashes with short-lasting high heat fluxes. For this purpose, coated and uncoated quasi-isotropic samples of the commercially available CFRP HexPly® 8552/IM7 are thermally irradiated from one side by a cone calorimeter and a xenon short-arc lamp with heat fluxes between 5 and 175 W/cm² over varying time intervals. The specimen temperature is recorded on the front and back as well as at different laminate depths. The CFRP is non-destructively examined by ultrasonic testing, infrared spectroscopy (ATR-FTIR), scanning electron microscopy (SEM), and micro-focus X-ray computed tomography (µCT). Destructive tests are performed to evaluate the mechanical properties in terms of interlaminar shear strength (ILSS), compressive strength, and tensile strength. The irradiation scenarios vary significantly in heat flux and exposure time; thus, different heating rates, radiation effects, and temperature distributions occur. This leads to unequal decomposition processes, which affect the sensitivity of each strength type and the damage behaviour of the specimens. With the use of surface coatings, however, thermal degradation of composite materials can be delayed.

Keywords: CFRP, one-sided thermal damage, high heat flux, heating rate, non-destructive and destructive testing

Procedia PDF Downloads 96
13729 The Mask of Motherhood: A Changing Identity during the Transition to Motherhood

Authors: Geraldine Mc Loughlin, Mary Horgan, Rosaleen Murphy

Abstract:

Childbirth is a life-changing event, a psychological transition for the mother that must be viewed in a social context. Much has been written and documented regarding the preparation for birth and the immediate postnatal period, but the full psychological impact on the mother is not clear. One aspect of the transition process is identity. Depending on a person's worldview, the concept of identity is viewed differently; views on the nature of reality and how knowledge is constructed influence these perspectives. Becoming a mother is not just an event but a process, and time and experience help to shape the woman's understanding of it. The aims of this study were to explore the emotional and psychological aspects of first-time mothers' experiences during the transition to new motherhood, and to identify factors affecting women's identities in the period from 36 weeks gestation to 12 weeks postpartum. Interpretative Phenomenological Analysis (IPA) was used; it explores how these women make sense of and give meaning to their experiences. IPA is underpinned by three key principles: phenomenology, hermeneutics, and idiography. A purposeful sample of 10 women was recruited for this longitudinal study, to enable data to be collected throughout the transition to motherhood. Individual identity was interpreted and viewed as developing in response to changing contexts, such as the birth event and becoming a parent, enabling the woman to construct her own sense of a meaningful life. Women effectively differentiated between their personal and social identities and took responsibility for their actions. Identity is culturally and socially shaped and experienced, though not experienced in the same way by all women. The individualized perspective on identity recognizes (a) that social influences are seen as external to the individual and (b) the view that social influences are, in fact, internalized by the individual.

Keywords: motherhood, transition, identity, IPA

Procedia PDF Downloads 44
13728 Dual-Actuated Vibration Isolation Technology for a Rotary System’s Position Control on a Vibrating Frame: Disturbance Rejection and Active Damping

Authors: Kamand Bagherian, Nariman Niknejad

Abstract:

A vibration isolation technology for precise position control of a rotary system powered by two permanent magnet DC (PMDC) motors is proposed, where the system is mounted on an oscillatory frame. To achieve vibration isolation for this system, an active damping and disturbance rejection (ADDR) technology is presented, which introduces cooperation between a main and an auxiliary PMDC motor controlled by discrete-time sliding mode control (DTSMC) based schemes. The controller of the main actuator tracks a desired position while the auxiliary actuator simultaneously isolates the induced vibration, as its controller follows a torque trend. To determine this torque trend, a combination of two algorithms is introduced by the ADDR technology. The first torque-trend-producing algorithm rejects the disturbance by counteracting the perturbation, estimated using a model-based observer. The second torque trend applies active variable damping to minimize the oscillation of the output shaft. In this work, the presented technology is implemented on a rotary system with a pendulum attached, mounted on a linear actuator simulating an oscillation-transmitting structure. The obtained results illustrate the functionality of the proposed technology.
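
As a rough illustration of the DTSMC scheme named above (a sketch under strong assumptions, not the authors' controller), one control update for a single PMDC axis can look like the following; the plant model, gains, and boundary layer are hypothetical, and the auxiliary torque-trend logic is omitted.

```python
# Minimal discrete-time sliding mode position controller on a toy
# unit-inertia rotor; everything here is illustrative.
import numpy as np

def dtsmc_step(theta, omega, theta_ref, lam=5.0, k=10.0, phi=0.1):
    """One update: sliding variable s = lam*e + e_dot, torque u = -k*sat(s/phi)."""
    e = theta - theta_ref
    s = lam * e + omega                       # sliding variable
    return -k * np.clip(s / phi, -1.0, 1.0)   # boundary layer limits chatter

dt, theta, omega = 1e-3, 0.0, 0.0             # 1 kHz loop, start at rest
for _ in range(5000):
    u = dtsmc_step(theta, omega, theta_ref=1.0)
    omega += u * dt                           # unit inertia: torque = accel
    theta += omega * dt
print(f"position after 5 s: {theta:.3f} rad (ref = 1.0)")
```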

Keywords: active damping, discrete-time nonlinear controller, disturbance tracking algorithm, oscillation transmitting support, position control, stability robustness, vibration isolation

Procedia PDF Downloads 91
13727 Applying (1, T) Ordering Policy in a Multi-Vendor-Single-Buyer Inventory System with Lost Sales and Poisson Demand

Authors: Adel Nikfarjam, Hamed Tayebi, Sadoullah Ebrahimnejad

Abstract:

This paper considers a two-echelon inventory system with a number of warehouses and a single retailer. The retailer replenishes its required items from the warehouses and assembles them into a single final product. We assume that each warehouse supplies only one kind of raw material to the retailer. The demand process for the final product is assumed to be Poisson, and unsatisfied demand for the final product is lost. The retailer applies a one-for-one-period ordering policy, also known as the (1, T) ordering policy. Under this policy, the retailer orders from each warehouse a fixed quantity of each item at fixed time intervals, where the fixed quantity is equal to the utilization of the item in the final product. Since this policy eliminates all demand uncertainty at the upstream echelon, the standard lot-sizing model can be applied at all warehouses. In this paper, we derive the total cost function of the inventory system. Then, based on this function, we present a procedure to obtain the optimal time interval between two consecutive order placements from the retailer to the warehouses, and the optimal order quantities of the warehouses (assuming positive ordering costs at the warehouses). Finally, we present some numerical examples and conduct a numerical sensitivity analysis for the cost parameters.
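
Although the paper derives the cost function analytically, the (1, T) policy is easy to evaluate numerically as well. The sketch below is an illustrative Monte Carlo evaluation for a single item under assumed cost parameters, scanning T for the cheapest interval; it is a simplification, not the paper's procedure.

```python
# Illustrative (1, T) policy evaluation: order a fixed quantity T*lam every T
# periods, Poisson demand, unmet demand lost. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(3)

def avg_cost(T, lam=2.0, h=1.0, p=20.0, K=50.0, horizon=20000):
    """Mean cost per period: holding h, lost-sale penalty p, fixed cost K."""
    inv, cost = 0.0, 0.0
    for period in range(horizon):
        if period % T == 0:
            inv += T * lam          # fixed quantity arrives each interval
            cost += K
        d = rng.poisson(lam)
        sold = min(d, inv)
        cost += p * (d - sold)      # unsatisfied demand is lost
        inv -= sold
        cost += h * inv             # end-of-period holding cost
    return cost / horizon

best_T = min(range(1, 15), key=avg_cost)
print(f"lowest-cost order interval: T = {best_T} periods")
```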

Keywords: two-echelon supply chain, multi-vendor-single-buyer inventory system, lost sales, Poisson demand, one-for-one-period policy, lot sizing model

Procedia PDF Downloads 296
13726 Quantum Engine Proposal using Two-level Atom Like Manipulation and Relativistic Motoring Control

Authors: Montree Bunruangses, Sonath Bhattacharyya, Somchat Sonasang, Preecha Yupapin

Abstract:

A two-level system is manipulated by a microstrip add-drop circuit configured as an atom-like system for wave-particle behavior investigation when its traveling speed along the circuit perimeter is the speed of light. The entangled pair formed by the upper and lower sideband peaks is bound by the angular displacement, which is given by 0≤θ≤π/2. The control signals associated with the three peak signal frequencies are applied via the external inputs of the microstrip add-drop multiplexer ports, where they are functions of time with no space term involved. When the system satisfies the speed-of-light condition, the mass term is converted to energy according to the relativistic limit described by the Lorentz factor and the Einstein equation. The different applied frequencies can be utilized to form three-phase torques that can be applied in quantum engines. The experiment will use the two-level system circuit and be conducted in the laboratory. The three-phase torques will be recorded and investigated for the purpose of driving a quantum engine, and the obtained results will be compared to the simulation. The optimum amplification of torque can be obtained by a resonant successive filtering operation. The torque vanishes when the system is balanced at the stopped position, where |Time| = 0, which is required as a system stability condition. This will be discussed for future applications, and a larger device may be tested in the future for realistic use. Synchronously and asynchronously driven motors are also discussed for warp drive use.

Keywords: quantum engine, relativistic motor, 3-phase torque, atomic engine

Procedia PDF Downloads 46
13725 Iron Catalyst for Decomposition of Methane: Influence of Al/Si Ratio Support

Authors: A. S. Al-Fatesh, A. A. Ibrahim, A. M. AlSharekh, F. S. Alqahtani, S. O. Kasim, A. H. Fakeeha

Abstract:

Hydrogen is expected to be the fuel of the future, since it produces energy without any pollution. It can be used as a fuel directly or through a fuel cell. It is also used in the chemical and petrochemical industries as a reducing agent or in hydrogenation processes. It is produced by different methods such as hydrocarbon reforming, electrolysis, and methane decomposition. The objective of the present paper is to study the methane decomposition reaction at 700°C and 800°C. The catalysts were prepared via the impregnation method using 20% Fe and different proportions of combined alumina and silica support, in the following ratios: 100%, 90%, 80%, and 0% Al₂O₃/SiO₂. The prepared catalysts were calcined and activated at 600°C and 500°C, respectively. The reaction was carried out in a fixed bed reactor at atmospheric pressure using 0.3 g of catalyst and a feed gas ratio of 1.5/1 CH₄/N₂, with a total flow rate of 25 mL/min. Catalyst characterizations (TPR, TGA, BET, XRD, etc.) were employed to study the behavior of the catalysts before and after the reaction. Moreover, a brief description of the weight loss and the CH₄ conversion versus time on stream for the different support ratios over 20%Fe/Al₂O₃/SiO₂ catalysts has been added as well. The TGA results showed higher weight losses for catalysts operated at 700°C than at 800°C. For 90% Al₂O₃/SiO₂, the activity decreased with time on stream at a reaction temperature of 800°C, from an initial CH₄ conversion of 73.9% to 46.3% over a period of 300 min, whereas the activity of the same catalyst increased from 47.1% to 64.8% when a reaction temperature of 700°C was employed. Likewise, for 80% Al₂O₃/SiO₂ the activity trend was similar to that of 90% Al₂O₃/SiO₂, but with a different rate of activity variation. It can be inferred from the activity results that the ratio of Al₂O₃ to SiO₂ is crucial and is directly proportional to the activity: whenever the Al/Si ratio decreases, the activity declines. Indeed, the CH₄ conversion over the 100% SiO₂ support was less than 5%.

Keywords: Al₂O₃, SiO₂, CH₄ decomposition, hydrogen, iron

Procedia PDF Downloads 168
13724 Virtual Team Performance: A Transactive Memory System Perspective

Authors: Belbaly Nassim

Abstract:

Virtual team (VT) initiatives, in which teams are geographically dispersed and communicate via modern computer-driven technologies, have attracted increasing attention from researchers and professionals. The growing need to examine how to balance and optimize VTs is particularly important given the globalization and decentralization pressures companies face when monitoring VT performance. Organizations are regularly limited by misalignment between the behavioral capabilities of the team's dispersed competences and knowledge capabilities, by the way trust issues interplay with and influence these VT dimensions, and by the effects of such exchanges. In fact, the future success of a business depends on the extent to which its VTs efficiently manage their dispersed expertise, skills, and knowledge to stimulate VT creativity. A transactive memory system (TMS) may enhance VT creativity through its three dimensions: knowledge specialization, credibility, and knowledge coordination. A TMS can be understood as a composition of both a structural component residing in individual knowledge and a set of communication processes among individuals. Individual knowledge is shared while being retrieved and applied, and the learning is coordinated. TMS is driven by the central concept that the system is built on the distinction between internal and external memory encoding. A VT learns something new and catalogs it in memory for future retrieval and use. TMS uses the role of information technology to explain VT behaviors by offering VT members the possibility to encode, store, and retrieve information. TMS considers the members of a team as a processing system in which the location of expertise both enhances knowledge coordination and builds trust among members over time. We build on the TMS dimensions to hypothesize the effects of specialization, coordination, and credibility on VT creativity. In fact, VTs consist of dispersed expertise, skills, and knowledge that can positively enhance coordination and collaboration. Ultimately, this team composition may lead to recognition of both who has expertise and where that expertise is located; over time, it may also build trust among VT members, developing the ability to coordinate their knowledge, which can stimulate creativity. We also assess the reciprocal relationship between the TMS dimensions and VT creativity. We use TMS to provide researchers with a theoretically driven model that is empirically validated through survey evidence, and we propose that TMS provides a new way to enhance and balance VT creativity. This study also gives researchers insight into the use of TMS to positively influence VT creativity. In addition to our research contributions, we provide several managerial insights into how TMS components can be used to increase performance within dispersed VTs.

Keywords: virtual team creativity, transactive memory systems, specialization, credibility, coordination

Procedia PDF Downloads 151