Search results for: retention time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18295

13765 Preoperative 3D Planning and Reconstruction of Mandibular Defects for Patients with Oral Cavity Tumors

Authors: Janis Zarins, Kristaps Blums, Oskars Radzins, Renars Deksnis, Atis Svare, Santa Salaka

Abstract:

Wide tumor resection remains the first-choice method for tumors of the oral cavity. Nevertheless, the remaining tissue defect impacts the patient's functional and aesthetic outcome, which can be improved using microvascular tissue transfers. Mandibular reconstruction is challenging due to the complexity of composite tissue defects and the occlusal relationships required for normal eating, chewing, and pain-free jaw motion. Individualized 3D virtual planning can provide better symmetry and functional outcome. The main goal of preoperative planning is to develop a customized surgical approach with patient-specific cutting guides for the mandible, osteotomy guides for the fibula, and pre-bent osteosynthesis plates in order to perform a more precise reconstruction, decrease surgery time, and achieve the best outcome. Our study is based on the analysis of 32 patients operated on between 2019 and 2021. All patients underwent mandible reconstruction with vascularized fibula flaps. Patient characteristics, surgical profile, survival, functional outcome, and quality of life were evaluated. Preoperative planning provided a significant decrease in surgery time and a bone arrangement closely matching the preoperative anatomy. In cases of bone asymmetry, deformity, and malposition, a new mandible was created using 3D planning to restore the appearance and functionality of the lower jaw.

Keywords: mandibular, 3D planning, cutting guides, fibula flap, reconstruction

Procedia PDF Downloads 109
13764 Gene Expression Profiling of Iron-Related Genes of Pasteurella multocida Serotype A Strain PMTB2.1

Authors: Shagufta Jabeen, Faez Jesse Firdaus Abdullah, Zunita Zakaria, Nurulfiza Mat Isa, Yung Chie Tan, Wai Yan Yee, Abdul Rahman Omar

Abstract:

Pasteurella multocida is associated with acute as well as chronic infections in avian and bovine species, such as pasteurellosis and hemorrhagic septicemia (HS) in cattle and buffaloes. Iron is one of the most important nutrients for pathogenic bacteria, including Pasteurella; it acts as a cofactor or prosthetic group in several essential enzymes and is needed for amino acid, pyrimidine, and DNA biosynthesis. In our recent study, we showed that about 2% of the genome of Pasteurella multocida serotype A strain PMTB2.1 encodes iron-regulating genes (accession number CP007205.1). Genome sequencing of other Pasteurella multocida serotypes, namely PM70 and HB01, also indicated that up to 2.5% of the respective genomes encode iron-regulating genes, suggesting that the Pasteurella multocida genome comprises multiple systems for iron uptake. P. multocida PMTB2.1 has more than 40 coding sequences out of 2,097 (approximately 2%) that encode iron-regulated proteins. The expression profiles of four iron-regulating genes, namely fbpb, yfea, fece, and fur, were characterized under an iron-restricted environment. The P. multocida strain PMTB2.1 was grown in broth with and without an iron-chelating agent, and samples were collected at different time points. The relative mRNA expression profiles of these genes were determined using a TaqMan probe-based real-time PCR assay. Data analysis, normalization against two housekeeping genes, and quantification of fold changes were carried out using Bio-Rad CFX Manager software version 3.1. The results of this study show that an iron-reduced environment has a significant effect on the expression profiles of iron-regulating genes (p < 0.05) compared to the control (normal broth), and that all evaluated genes responded differently to iron reduction in the media. The highest relative fold change of the fece gene was observed at the early stage of treatment, indicating that PMTB2.1 may utilize its periplasmic protein at an early stage to acquire iron. Furthermore, the down-regulation of fece together with the elevated expression of other genes at later time points suggests that PMTB2.1 controls its iron requirements in response to iron availability by down-regulating the expression of iron proteins. Moreover, the significantly high relative fold change (p ≤ 0.05) of the fbpb gene is probably associated with the ability of P. multocida to directly use host iron complexes such as heme and hemoglobin. In addition, the significant increase (p ≤ 0.05) in fbpb and yfea expression also reflects the utilization of multiple iron systems in P. multocida strain PMTB2.1. These findings are important because the relative scarcity of free iron within hosts creates a major barrier to microbial growth, and the utilization of the outer-membrane protein system for iron acquisition probably occurs at an early stage of infection with P. multocida. In conclusion, the presence and utilization of multiple iron systems in P. multocida strain PMTB2.1 underline the importance of iron for the survival of P. multocida.
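As an illustration of the quantification step, the sketch below reproduces the ΔΔCt fold-change calculation that CFX Manager performs internally, normalized against two housekeeping genes; the Ct values and the assumption of roughly 100% PCR efficiency are hypothetical, not values from the study.

```python
# Minimal sketch of relative fold-change quantification (ddCt method) with two
# reference (housekeeping) genes. All Ct values below are hypothetical.
def relative_fold_change(ct_target, ct_refs, ct_target_cal, ct_refs_cal):
    """Fold change of a target gene in a treated sample vs. a calibrator
    (untreated) sample, normalized against the reference genes."""
    # delta Ct = Ct(target) - mean Ct(references), computed per sample
    d_ct_sample = ct_target - sum(ct_refs) / len(ct_refs)
    d_ct_calib = ct_target_cal - sum(ct_refs_cal) / len(ct_refs_cal)
    dd_ct = d_ct_sample - d_ct_calib          # delta-delta Ct
    return 2.0 ** (-dd_ct)                    # assumes ~100% PCR efficiency

# Hypothetical Ct values for the fece gene at an early time point
fold = relative_fold_change(ct_target=21.3, ct_refs=[17.8, 18.1],
                            ct_target_cal=24.9, ct_refs_cal=[17.9, 18.0])
print(f"fece relative fold change: {fold:.1f}")
```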

Keywords: iron-related genes, real-time PCR, gene expression profiling, fold changes

Procedia PDF Downloads 437
13763 Development of Latent Fingerprints on Non-Porous Surfaces Recovered from Fresh and Sea Water

Authors: Somaya Madkour, Abeer Sheta, Fatma Badr El Dine, Yasser Elwakeel, Nermine AbdAllah

Abstract:

Criminal offenders have a fundamental goal of leaving no traces at the crime scene. Some may suppose that items recovered underwater have no forensic value and therefore try to destroy traces by throwing items into water. These traces are then subjected to destructive environmental effects, which can represent a challenge for forensic experts investigating finger marks. Accordingly, the present study was conducted to determine the optimal method for developing latent fingerprints on non-porous surfaces submerged in aquatic environments for different time intervals. The two factors analyzed in this study were the nature of the aquatic environment and the length of submersion. In addition, the quality of the developed finger marks was assessed for each method used. Latent fingerprints were deposited on metallic, plastic, and glass objects and submerged in fresh or sea water for one, two, or ten days. After recovery, the items were subjected to cyanoacrylate fuming, black powder, and small particle reagent processing, and the prints were examined. Each print was evaluated according to a fingerprint quality assessment scale. The present study demonstrated that the duration of submersion affects the quality of finger marks: the longer the duration, the worse the quality. The best visualization results were achieved using cyanoacrylate, in both fresh and sea water. This study also revealed that exposure to sea water had a more destructive influence on the quality of the detected finger marks.

Keywords: fingerprints, fresh water, sea, non-porous

Procedia PDF Downloads 436
13762 Artificial Intelligence in the Design of a Retaining Structure

Authors: Kelvin Lo

Abstract:

Nowadays, numerical modelling in geotechnical engineering is common but sophisticated. Many advanced input settings and considerable computational effort are required to optimize a design in order to reduce the construction cost. Optimizing a design usually requires huge numerical models. If the optimization is conducted manually, human error can have potentially dangerous consequences, and the time spent on input and on extracting data from the output is significant. This paper presents an automation process introduced to the numerical modelling (Plaxis 2D) of a trench excavation supported by a secant-pile retaining structure for a top-down tunnel project. Python code is adopted to control the process, and numerical modelling is conducted automatically at every 20 m chainage along the 200 m tunnel, with the maximum retained height occurring at the middle chainage. The Python code continuously changes the geological stratum and excavation depth under groundwater flow conditions in each 20 m section. It automatically conducts trial and error to determine the required pile length and the use of props to achieve the required factor of safety and target displacement. Once the bending moment of the pile exceeds its capacity, the pile size is increased. When the pile embedment reaches the default maximum length, the prop system is activated. Results showed that the approach saves time, increases efficiency, lowers design costs, and replaces manual labor, minimizing error.
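A minimal sketch of this trial-and-error loop is given below. In the real workflow the analysis function would wrap calls to the Plaxis 2D remote scripting API; here it is a dummy stand-in, and the capacities, limits, and targets are assumed values, so the snippet only illustrates the control logic.

```python
# Sketch of the automated trial-and-error loop described above. analyse() is a
# dummy stand-in for a Plaxis 2D staged-construction run so the loop can be run.
def analyse(depth_m, pile_length_m, use_props):
    """Dummy stand-in: returns (factor of safety, wall displacement [mm],
    maximum pile bending moment [kNm]) for the current design."""
    support = pile_length_m + (8.0 if use_props else 0.0)
    fos = 0.9 + 0.05 * (support - depth_m)
    disp = max(5.0, 400.0 * depth_m / support ** 1.5)
    moment = 30.0 * depth_m ** 2 / (1.0 + 0.2 * support)
    return fos, disp, moment

def design_section(depth_m, capacity_knm=800.0, max_len=30.0,
                   target_fos=1.4, target_disp=50.0):
    pile_length, use_props = 12.0, False
    while True:
        fos, disp, moment = analyse(depth_m, pile_length, use_props)
        if moment > capacity_knm:
            capacity_knm *= 1.5          # "increase pile size" -> higher capacity
        elif fos < target_fos or disp > target_disp:
            if pile_length < max_len:
                pile_length += 1.0       # deepen embedment and retry
            else:
                use_props = True         # embedment limit reached: turn on props
        else:
            return pile_length, use_props, capacity_knm

# every 20 m chainage along the 200 m tunnel, deepest excavation mid-alignment
for ch in range(0, 201, 20):
    depth = 8.0 + 10.0 * (1.0 - abs(ch - 100) / 100.0)
    print(ch, design_section(depth))
```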

Keywords: automation, numerical modelling, Python, retaining structures

Procedia PDF Downloads 38
13761 Transformative Measures in Chemical and Petrochemical Industry Through Agile Principles and Industry 4.0 Technologies

Authors: Bahman Ghorashi

Abstract:

The immense awareness of global climate change has compelled traditional fossil fuel companies to develop strategies to reduce their carbon footprint and simultaneously consider the production of various sources of clean energy in order to mitigate the environmental impact of their operations. Similarly, supply chain issues, the scarcity of certain raw materials, energy costs, market needs, and changing consumer expectations have forced the traditional chemical industry to reexamine its time-honored modes of operation. This study examines how such transformative change might occur through the application of agile principles and Industry 4.0 technologies. Clearly, such a transformation is complex, costly, and requires total commitment on the part of the top leadership and the entire management structure. Factors that need to be considered include the organizational speed of change; a restructuring oriented toward collaboration and toward selling solutions to customers' problems rather than just products; integrating 'along' as well as 'across' value chains; mastering change and uncertainty; recognizing the importance of concept-to-cash time, i.e., the velocity of introducing new products to market; and leveraging people and information. At the same time, parallel to implementing such major shifts in the ethos and fabric of the organization, the change leaders should remain mindful of the company's DNA while incorporating the necessary DNA-defying shifts. Furthermore, such strategic maneuvers should incorporate the management of upstream and downstream operations, harnessing future opportunities, preparing and training the workforce, implementing faster decision making and quick adaptation to change, managing accelerated response times, and forming autonomous, cross-functional teams. Moreover, the leaders should establish the balance between high-value solutions and high-margin products, fully implement the digitization of operations and, when appropriate, incorporate the latest relevant technologies, such as AI, IIoT, ML, and immersive technologies. This study presents a summary of the agile principles and the relevant technologies and draws lessons from some of the best practices already implemented within the chemical industry in order to establish a roadmap to agility. Finally, the critical role of educational institutions in preparing the future workforce for Industry 4.0 is addressed.

Keywords: agile principles, immersive technologies, industry 4.0, workforce preparation

Procedia PDF Downloads 92
13760 Planning and Urban Climate Change Adaptation: Italian Literature Review

Authors: Mara Balestrieri

Abstract:

Climate change has long been a focus of attention owing to the growing impact of extreme weather events and global warming in many areas of the planet and the evidence of the economic, social, and environmental damage they cause. Nowadays, climate change is recognized as a critical global problem. Several initiatives have been undertaken over time to build on the long theoretical debate and field experience in order to reduce CO2 emissions and contain climate alteration. However, the awareness that climate change is already taking place has led to a growing demand for adaptation. It is certainly a matter of anticipating the negative effects of climate change but, at the same time, implementing appropriate actions to prevent climate-related damage, minimize the problems that may result, and seize any opportunities that may arise. Consequently, adaptation has become a core element of climate policy and research. However, attention to this issue has not developed uniformly across countries; some countries are further ahead than others. This paper examines the literature on climate change adaptation developed in Italy up to 2018, considering the urban dimension, in order to provide a framework for it and to identify its main topics and features. The papers were selected from Scopus and were analyzed through a matrix that we propose. Results demonstrate that climate change adaptation studies have attracted increasing attention from the Italian scientific community in recent years, although Italian scientific production is still quantitatively lower than that of other countries, and they reveal strengths and weaknesses in line with the international panorama with respect to objectives, sectors, and problems.

Keywords: adaptation, bibliometric literature, climate change, urban studies

Procedia PDF Downloads 57
13759 Transferring Data from Glucometer to Mobile Device via Bluetooth with Arduino Technology

Authors: Tolga Hayit, Ucman Ergun, Ugur Fidan

Abstract:

Being healthy is undoubtedly an indispensable necessity for human life. With technological improvements, various health monitoring and imaging systems have been developed in the literature to satisfy health needs. In this context, monitoring and recording individual health data via wireless technology has also become part of these studies. Nowadays, mobile devices are found in almost every home, have become indispensable in our daily lives, and are equipped with wireless technology infrastructure; they therefore play an important role in enabling health follow-up anywhere and at any time, since they are used in health monitoring systems. In this study, Arduino, an open-source microcontroller board, was used, with a sample blood glucose meter connected in series. In this way, the glucose data (glucose level, time) obtained with the glucometer are transferred via Bluetooth to a mobile device running the Android operating system. A mobile application was developed using the Apache Cordova framework for listing the data, presenting it graphically, and reading data from the Arduino. Apache Cordova, HTML, JavaScript, and CSS were used for the coding. The data received from the glucometer are stored in the local database of the mobile device. It is intended that people can transfer their measurements to their mobile device using wireless technology and access graphical representations of their data. In this context, the aim of the study is to enable health monitoring using the different wireless technologies that mobile devices can currently support. Thus, the study will contribute to other work done in this area.

Keywords: Arduino, Bluetooth, glucose measurement, mobile health monitoring

Procedia PDF Downloads 300
13758 Postoperative Radiotherapy in Cancers of the Larynx: Experience of the Emir Abdelkader Cancer Center of Oran, about 89 Cases

Authors: Taleb Lotfi, Benarbia Maheidine, Allam Hamza, Boutira Fatima, Boukerche Abdelbaki

Abstract:

Introduction and purpose of the study: This is a retrospective single-center study with the analytical aim of determining the prognostic factors for relapse in patients treated with radiotherapy after total laryngectomy with lymph node dissection for laryngeal cancer at the Emir Abdelkader cancer center in Oran (Algeria). Material and methods: During the study period from January 2014 to December 2018, eighty-nine patients (n=89) with squamous cell carcinoma of the larynx were treated with postoperative radiotherapy. Relapse-free survival was studied in univariate analysis according to pre-treatment criteria using Kaplan-Meier survival curves, and a univariate analysis was performed to identify relapse factors. Statistically significant factors were then studied in multivariate analysis according to the Cox model. Results and statistical analysis: The average age was 62.7 years (40-86 years). All cases were squamous cell carcinomas. Postoperatively, the tumor was classified as pT3 or pT4 in 93.3% of patients. Histological lymph node involvement was found in 36 cases (40.4%), with capsular rupture in 39% of cases, while the surgical margins were microscopically infiltrated in 11 patients (12.3%). Chemotherapy concomitant with radiotherapy was used in 67.4% of patients. With a median follow-up of 57 months (23 to 104 months), the probabilities of five-year relapse-free survival and overall survival were 71.2% and 72.4%, respectively. The factors correlated with a high risk of relapse were locally advanced tumor stage pT4 (p=0.001), subglottic extension of the tumor site (p=0.0003), infiltrated surgical margins R1 (p=0.001), and lymph node involvement (p=0.002), particularly in the event of lymph node capsular rupture (p=0.0003), as well as the time between surgery and adjuvant radiotherapy (p=0.001). However, in the subgroup analysis, the major prognostic factors for disease-free survival were subglottic tumor extension (p=0.001) and the time from surgery to adjuvant radiotherapy (p=0.005). Conclusion: Combined surgery and postoperative radiation therapy are effective treatment modalities in the management of laryngeal cancer. Close cooperation of the entire cervicofacial oncology team, expressed during multidisciplinary consultation meetings, is essential, along with the need to respect the interval between surgery and radiotherapy.
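For readers who want to reproduce this kind of analysis, the sketch below shows the Kaplan-Meier and Cox steps using the Python lifelines library; the file name and column names are hypothetical stand-ins for the study's own dataset, which is not public.

```python
# Sketch of the survival workflow described above, using the lifelines library.
# The CSV file and its column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("larynx_cohort.csv")   # hypothetical file, one row per patient

# Univariate analysis: relapse-free survival stratified by subglottic extension
kmf = KaplanMeierFitter()
for label, grp in df.groupby("subglottic_extension"):
    kmf.fit(grp["rfs_months"], event_observed=grp["relapse"], label=str(label))
    print(label, kmf.median_survival_time_)

# Multivariate analysis: Cox proportional hazards model on the significant factors
cph = CoxPHFitter()
cph.fit(df[["rfs_months", "relapse", "pT4", "subglottic_extension",
            "margin_R1", "node_positive", "capsular_rupture",
            "surgery_to_rt_delay_weeks"]],
        duration_col="rfs_months", event_col="relapse")
cph.print_summary()
```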

Keywords: laryngeal cancer, laryngectomy, postoperative radiotherapy, survival

Procedia PDF Downloads 86
13757 Analysis of Accessibility of Tourism Transportation in Banyuwangi

Authors: Lilla Anjani, Ervina Ahyudanari

Abstract:

Tourism is one of the contributors to regional economic income. Banyuwangi has seen rapid development in the tourism sector, especially since 2010; there are 25 tourist locations that can serve as destinations. Banyuwangi has tourism transportation to make tourist places easier to reach. This transportation operates on six routes, with final destinations at Ijen Crater, Glenmore, Bajangan, Bangsring, Red Island, and Pine Forest. Despite the availability of tourism transportation, tourists tend to use a private car or rent a car because many tourist places cannot be accessed by public transportation. Tourism transportation is also a form of sustainable tourism development, in line with the Sustainable Development Goals. The Banyuwangi government has a special program for tourism development that is supported by all sectors in Banyuwangi. To support the development of tourism in Banyuwangi, it is necessary to analyze the existing tourism transportation and to propose new routes to reach all tourism locations in Banyuwangi Regency. The analysis carried out in this study covers accessibility, distance, and travel time to the tourism locations. Thirty tourism destination points were taken from 39 ODTW references from the transportation service and the tourism office of Banyuwangi Regency, and the Banyuwangi tourism objects were divided into six zones based on travel time and distance. The highest accessibility value for Zone A is 51.96, and the lowest is 11.989. The highest accessibility value for Zone B is 33.4269, and the lowest is 21.737. The highest accessibility value for Zone C is 33.407, and the lowest is 14.848. The highest accessibility value for Zone D is 58.967, and the lowest is 14.742. The highest accessibility value for Zone E is 56.401, and the lowest is 14.1. The highest accessibility value for Zone F is 176.14, and the lowest is 44.1. There are two tourist transportation routes with six sessions every day. The resulting new routes group locations that can be reached within one particular area.
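The abstract does not state which accessibility formula underlies these zone values; purely as an illustration, the sketch below computes a common gravity-type (Hansen) accessibility index, with the attraction weights, travel times, and impedance exponent all assumed rather than taken from the study.

```python
# Sketch of a gravity-type (Hansen) accessibility index, a common choice for this
# kind of analysis; the exact formula used in the study is not given in the
# abstract, so the weights and impedance exponent below are assumptions.
def hansen_accessibility(attractions, travel_times, beta=1.0):
    """A_i = sum_j ( W_j / t_ij^beta ) for one origin i over destinations j."""
    return sum(w / (t ** beta) for w, t in zip(attractions, travel_times) if t > 0)

# Hypothetical example: visitor-based weights of three destinations in one zone
# and travel times (minutes) from a given origin.
weights = [120, 45, 80]
times_min = [25, 40, 60]
print(round(hansen_accessibility(weights, times_min), 2))
```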

Keywords: accessibility, tourism clustering, Banyuwangi tourism, sustainable development goals

Procedia PDF Downloads 67
13756 Early Education Assessment Methods

Authors: Anantdeep Kaur, Sharanjeet Singh

Abstract:

Early childhood education and the assessment of children are essential tools that support their growth and development. Techniques should be developed and tools created in this field, as it is a very important learning phase of life. Information and sources are included for student assessment to provide a record of growth in all developmental areas: cognitive, physical, language, social-emotional, and approaches to learning. For an early childhood educator, it is very important to identify children who need special support and counseling, because children are not mentally mature enough to discuss their problems and needs with the teacher. It is the duty and responsibility of the educator to assess children through their body language, behavior, and routine actions, identifying skills that can be improved and that can carry them forward in later life. Children should also be assessed on their weaker points, because this is the right time to correct them, and they can be improved with certain methods and tools by working on them constantly. Children should be observed regularly across all facets of development, including intellectual, linguistic, social-emotional, and physical development. A physical education class should be held every day to monitor physical growth activities, which helps assess physical activeness and motor abilities. When children are outside on the playground, it is important to instill environmental understanding in them so that they know they are part of nature; this helps them feel at one with their surroundings rather than isolated. This approach helps them live their childhood full of energy. All types of assessment have unique purposes; it is important first to determine what should be measured and then to find the program that best assesses those areas.

Keywords: special needs, motor ability, environmental understanding, physical development

Procedia PDF Downloads 81
13755 Assessment of Tidal Influence in Spatial and Temporal Variations of Water Quality in Masan Bay, Korea

Authors: S. J. Kim, Y. J. Yoo

Abstract:

Slack-tide sampling was carried out at seven stations at high and low tide over a tidal cycle, in summer (July, August, September) and fall (October) 2016, to determine the differences in water quality according to tide in Masan Bay. The data were analyzed by Pearson correlation and factor analysis. The mixing state of all the water quality components investigated is well explained by their correlation with salinity (SAL). Turbidity (TURB), dissolved silica (DSi), nitrite and nitrate nitrogen (NNN), and total nitrogen (TN), which enter the bay from the streams and have no internal source or sink reaction, showed a strong negative correlation with SAL at low tide, indicating conservative mixing. On the contrary, in summer and fall, dissolved oxygen (DO), hydrogen sulfide (H2S), and chemical oxygen demand with KMnO4 (CODMn) of the surface and bottom water, which are sensitive to internal source and sink reactions, showed no significant correlation with SAL at high or low tide. The remaining water quality parameters showed a conservative or a non-conservative mixing pattern depending on the mixing characteristics at high and low tide, determined by the functional relationship between changes in the flushing time and changes in the characteristics of the water quality components of the end-members in the bay. Factor analysis performed on the data sets of concentration differences between high and low tide helped identify their principal latent variables. The concentration differences varied spatially and temporally. Principal factor (PF) score plots for each monitoring situation showed strong associations between the variations and the monitoring sites. At sampling station 1 (ST1), temperature (TEMP), SAL, DSi, TURB, NNN, and TN of the surface water in summer; TEMP, SAL, DSi, DO, TURB, NNN, TN, reactive soluble phosphorus (RSP), and total phosphorus (TP) of the bottom water in summer; TEMP, pH, SAL, DSi, DO, TURB, CODMn, particulate organic carbon (POC), ammonia nitrogen (AMN), NNN, TN, and fecal coliform (FC) of the surface water in fall; and TEMP, pH, SAL, DSi, H2S, TURB, CODMn, AMN, NNN, and TN of the bottom water in fall commonly showed up as the most significant parameters, with large concentration differences between high and low tide. At other stations, the significant parameters differed according to the spatial and temporal variations of the mixing pattern in the bay. In fact, no estuary always maintains steady-state flow conditions. The mixing regime of an estuary may change at any time from linear to non-linear, due to changes in flushing time arising from the combination of hydrogeometric properties, freshwater inflow, and tidal action; furthermore, the change of end-member conditions due to internal sinks and sources makes the occurrence of concentration differences inevitable. Therefore, when investigating the water quality of an estuary, it is necessary to adopt a sampling method that accounts for the tide in order to obtain representative average water quality data.
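As a minimal sketch of this statistical treatment, the snippet below computes the Pearson correlation of each component with salinity and runs a factor analysis on the high-low tide concentration differences; the file names, column names, and number of factors are assumptions, not the study's actual settings.

```python
# Sketch of the statistical treatment described above: Pearson correlation of
# each water-quality component with salinity, and factor analysis of the
# high-minus-low-tide concentration differences. File/column names are assumed.
import pandas as pd
from scipy.stats import pearsonr
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

low_tide = pd.read_csv("masan_low_tide.csv")     # hypothetical per-station samples
params = ["TURB", "DSi", "NNN", "TN", "DO", "CODMn"]

# Mixing behaviour: a strong negative correlation with SAL suggests conservative mixing
for p in params:
    r, pval = pearsonr(low_tide["SAL"], low_tide[p])
    print(f"{p}: r = {r:+.2f}, p = {pval:.3f}")

# Factor analysis of the concentration differences between high and low tide
diff = pd.read_csv("masan_high_minus_low.csv")[params]
scores = FactorAnalysis(n_components=3).fit_transform(
    StandardScaler().fit_transform(diff))
print(scores[:5])   # principal factor scores used in the station-wise plots
```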

Keywords: conservative mixing, end-member, factor analysis, flushing time, high and low tide, latent variables, non-conservative mixing, slack-tide sampling, spatial and temporal variations, surface and bottom water

Procedia PDF Downloads 108
13754 Indian Business-Papers in Industrial Revolution 4.0: A Paradigm Shift

Authors: Disha Batra

Abstract:

Industrial Revolution 4.0 is quite different, and a paradigm shift is underway in the media industry. With the advent of automated journalism and social media platforms, newspaper organizations have changed the way news is gathered and reported. The emergence of the fourth industrial revolution in the early 21st century has forced newspapers to adapt to changing technologies in order to remain relevant. This paper investigates the content of Indian business-papers in the era of the fourth industrial revolution and how these organizations have evolved in the time of convergence. The study is a content analysis of the top three Indian business dailies, as per the Indian Readership Survey (IRS) 2017, over a decade. A parametric analysis of different parameters (source of information, use of illustrations, advertisements, layout, framing, etc.) has been carried out in order to identify the distinct adaptations and modifications made by these dailies. The paper also dwells upon the thematic analysis of these newspapers in order to explore the coverage given to various sub-themes of economic, business, and financial (EBF) journalism. Further, this study reveals the effect of high-speed algorithm-based trading, an aftermath of the fourth industrial revolution, on the creative and investigative aspects of delivering financial stories in these newspapers. The study indicates an ongoing paradigm shift in the business newspaper industry, with a marked change in the sources of information gathering along with a subtle increase in the coverage of financial news stories over time.

Keywords: business-papers, business news, financial news, industrial revolution 4.0

Procedia PDF Downloads 100
13753 A Prediction Model Using the Price Cyclicality Function Optimized for Algorithmic Trading in Financial Market

Authors: Cristian Păuna

Abstract:

After the widespread adoption of electronic trading, automated trading systems have become a significant part of the business intelligence systems of modern financial investment companies. A significant share of trades is now made completely automatically by computers using mathematical algorithms. The trading decisions are taken almost instantly by logical models, and the orders are sent by low-latency automatic systems. This paper presents a real-time price prediction methodology designed especially for algorithmic trading. Based on the price cyclicality function, the methodology generates price cyclicality bands to predict the optimal levels for entries and exits. In order to automate the trading decisions, the cyclicality bands generate automated trading signals. We have found that the model can be used with good results to predict changes in market behavior. Using these predictions, the model can automatically adapt the trading signals in real time to maximize the trading results. The paper presents the methodology to optimize and implement this model in automated trading systems. Tests prove that this methodology can be applied with good efficiency in different timeframes. Real trading results are also displayed and analyzed in order to qualify the methodology and to compare it with other models. In conclusion, the price prediction model using the price cyclicality function is found to be a reliable trading methodology for algorithmic trading in the financial market.
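The price cyclicality function itself is defined in the author's related work and is not reproduced in this abstract; purely as an illustration of the band-based signal logic, the sketch below substitutes a simple normalized moving-average oscillator for it, so it should not be read as the paper's actual model.

```python
# Illustration of band-based signal generation. A normalized fast/slow moving-
# average spread stands in for the price cyclicality function, which is defined
# in the author's related work and not reproduced here.
import numpy as np

def smoothed_oscillator(close, fast=20, slow=50):
    """Stand-in for the cyclicality function: a normalized fast/slow MA spread."""
    def sma(x, n):
        return np.convolve(x, np.ones(n) / n, mode="valid")
    fast_ma, slow_ma = sma(close, fast), sma(close, slow)
    m = min(len(fast_ma), len(slow_ma))
    spread = fast_ma[-m:] - slow_ma[-m:]
    return (spread - spread.mean()) / (spread.std() + 1e-9)

def band_signals(osc, k=1.0):
    """+1 entry when the oscillator crosses up through the lower band,
       -1 exit when it crosses down through the upper band, else 0."""
    lower, upper = -k, k
    sig = np.zeros_like(osc)
    sig[1:][(osc[:-1] < lower) & (osc[1:] >= lower)] = 1
    sig[1:][(osc[:-1] > upper) & (osc[1:] <= upper)] = -1
    return sig

prices = np.cumsum(np.random.default_rng(0).normal(0, 1, 500)) + 100  # synthetic
print(int(np.abs(band_signals(smoothed_oscillator(prices))).sum()), "signals")
```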

Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, price prediction

Procedia PDF Downloads 168
13752 Preparation of Pegylated Interferon Alpha-2b with High Antiviral Activity Using Linear 20 KDa Polyethylene Glycol Derivative

Authors: Ehab El-Dabaa, Omnia Ali, Mohamed Abd El-Hady, Ahmed Osman

Abstract:

Recombinant human interferon alpha 2 (rhIFN-α2) is FDA approved for the treatment of some viral and malignant diseases. Approved pegylated rhIFN-α2 drugs have greatly improved pharmacokinetics, pharmacodynamics, and therapeutic efficiency compared to the native protein. In this work, we studied the pegylation of purified, properly refolded rhIFN-α2b using a linear 20 kDa PEG-NHS (polyethylene glycol N-hydroxysuccinimidyl ester) to prepare pegylated rhIFN-α2b with high stability and activity. The effects of different parameters, such as final rhIFN-α2b concentration, pH, rhIFN-α2b/PEG molar ratio, and reaction time, on the pegylation efficiency (the percentage of monopegylated rhIFN-α2b) were studied in small-scale (100 µl) pegylation reaction trials. Study of the percentages of the different reaction components (mono-, di-, and polypegylated rhIFN-α2b and unpegylated rhIFN-α2b) indicated that 2 h is the optimum time to complete the reaction. The pegylation efficiency increased to 57.9% at pH 8 by reducing the protein concentration to 1 mg/ml and the rhIFN-α2b/PEG ratio to 1:2. Using a larger-scale pegylation reaction (65% pegylation efficiency), an ion exchange chromatography method was optimized to prepare and purify the monopegylated rhIFN-α2b with high purity (96%). The prepared monopegylated rhIFN-α2b had an apparent molecular weight of approximately 65 kDa and high in vitro antiviral activity (2.1×10⁷ ± 0.8×10⁷ IU/mg). Although it retained approximately 8.4% of the antiviral activity of the unpegylated rhIFN-α2b, its activity is high compared to other pegylated rhIFN-α2 preparations developed using a similar approach or a higher-molecular-weight branched PEG.

Keywords: antiviral activity, rhIFN-α2b, pegylation, pegylation efficiency

Procedia PDF Downloads 159
13751 Non-Invasive Techniques for Management of Carious Primary Dentition Using Silver Diamine Fluoride and Moringa Extract as a Modification of the Hall Technique

Authors: Rasha F. Sharaf

Abstract:

The treatment of dental caries in young children is considered a great challenge for all dentists, especially with uncooperative children. Recently, non-invasive techniques have been highlighted, as they eliminate the need for local anesthesia and other painful procedures during the management of carious teeth and, at the same time, increase the success rate of treatment. Silver diamine fluoride (SDF) is one of the most effective cariostatic materials; it arrests the progression of carious lesions and aids in remineralizing the demineralized tooth structure. Both fluoride and silver ions have proven antibacterial action and aid in the precipitation of an insoluble layer that prevents further decay. At the same time, Moringa has proven effective antibacterial action against different types of bacteria; therefore, it can be used in a non-invasive technique for the management of caries in children. One important approach for the control of caries is depriving cariogenic bacteria of nutrients, causing their starvation and death; this can be achieved by applying a stainless steel crown on primary molars with carious lesions not involving the pulp, a technique known as the Hall technique. The success rate of the Hall technique can be increased by first arresting the carious lesion using either SDF or Moringa and gaining the benefit of their antibacterial action. Multiple clinical cases with one-year follow-up are presented, comparing different treatment options and using various materials and techniques for the non-invasive, painless management of carious primary teeth.

Keywords: SDF, hall technique, carious primary teeth, moringa extract

Procedia PDF Downloads 81
13750 Food Safety and Quality Assurance and Skills Development among Farmers in Georgia

Authors: Kakha Nadiardze, Nana Phirosmanashvili

Abstract:

The goal of this paper is to present the problems caused by farmers' lack of information on food safety. Global food supply chains are becoming more and more diverse, making traceability systems much harder to implement across different food markets. In this abstract, we present our work analyzing key developments in the Georgian food market, from regulatory controls to administrative procedures to traceability technologies. Food safety and quality assurance are among the most problematic issues in Georgia; as food trade networks become more and more complex, food businesses are under increasing pressure to ensure that their products are safe and authentic. Farm-to-table follow-up principles must be top of mind for all food manufacturers, farmers, and retailers. Following the E. coli outbreak last year, as well as more recent cases of food mislabeling, the development of food traceability systems is essential for food businesses if they are to present a credible brand image. Alongside this are the ever-developing technologies in food traceability networks, which manufacturers and retailers need to be aware of if they are to keep up with food safety regulations and avoid recalls. The main question is how to examine best practice in food management in order to protect the company brand through safe and authenticated food. We are bringing our farmers together with food safety experts and technology developers throughout the food supply chain. We periodically provide food analyses for heavy metals, pesticide residues, and other pollutants. We also disseminate information to farmers on how the latest food safety regulations will affect the methods they use to identify risks in their products.

Keywords: food safety, GMO, LMO, E. coli, quality

Procedia PDF Downloads 485
13749 Parent’s Evaluation of the Services Offered to Their Children with Autism in UAE Centres

Authors: Mohammad Ali Fteiha, Ghanem Al Bustami

Abstract:

The study aimed to identify how parents of children with autism assess the services provided by special care centers in the United Arab Emirates, in terms of quality and comprehensiveness, and to examine the impact of factors related to the diagnosis, the place of service provision, the efficiency of service procedures, and the child's age. To achieve the objective of the study, the researchers used the Parents' Satisfaction Scale and the Parents' Evaluation of Service Effectiveness; both the scale and the parents' reports showed an acceptable level of validity and reliability. The sample included 300 families of children with autism receiving educational, rehabilitation, treatment, and support services in both governmental and private centers in the United Arab Emirates. ANOVA tests were conducted in SPSS to analyze the collected data. The results indicated significant differences in the assessment of the services provided by the centers according to the place of service, the nature of the diagnosis, and the child's age at the time of the study, as well as statistically significant differences according to age at first diagnosis. The results also showed a positive evaluation of the level of services against international standards and of the quality of the services provided by autism centers in the United Arab Emirates, especially governmental centers. At the same time, the results showed that many needs and problems faced by the parents do not have appropriate solutions. Recommendations were formulated based on these results.

Keywords: autism, evaluation, diagnosis, parents, autism programs, supportive services, government centers, private centers

Procedia PDF Downloads 544
13748 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps

Authors: Ugochukwu Ejike Akpudo, Jank-Wook Hur

Abstract:

The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly growing research on the emerging condition-based maintenance concept of prognostics and health management (PHM). A variety of fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to a surge in research on standard feature extraction techniques for the optimized condition assessment of fuel pumps. By extracting time-domain, frequency-domain, and the more robust time-frequency features from the vibration signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of emerging algorithms such as locally linear embedding (LLE), we propose a method for the comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction results for the condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capability as a fault classification technique was explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
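A minimal sketch of this feature-extraction and fusion pipeline is shown below, using scikit-learn's LocallyLinearEmbedding; the specific time-, frequency-, and time-frequency-domain features, the sampling rate, and the placeholder vibration data are assumptions rather than the study's actual choices.

```python
# Sketch of the feature-fusion step described above: time-, frequency-, and
# time-frequency-domain features are extracted per vibration window and fused
# with locally linear embedding (LLE). Feature choices here are common defaults.
import numpy as np
from scipy.signal import stft
from scipy.stats import kurtosis
from sklearn.manifold import LocallyLinearEmbedding

def window_features(x, fs=10_000):
    rms = np.sqrt(np.mean(x ** 2))                       # time domain
    crest = np.max(np.abs(x)) / (rms + 1e-12)
    spec = np.abs(np.fft.rfft(x))                        # frequency domain
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    _, _, Z = stft(x, fs=fs, nperseg=256)                # time-frequency domain
    tf_energy = np.mean(np.abs(Z) ** 2, axis=1)[:8]      # first 8 band energies
    return np.hstack([rms, crest, kurtosis(x), centroid, tf_energy])

# windows: (n_windows, n_samples) array of vibration segments per operating state
windows = np.random.default_rng(1).normal(size=(200, 4096))   # placeholder data
X = np.vstack([window_features(w) for w in windows])
fused = LocallyLinearEmbedding(n_neighbors=12, n_components=3).fit_transform(X)
print(fused.shape)   # low-dimensional fused features for condition assessment
```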

Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion

Procedia PDF Downloads 100
13747 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels

Authors: Tal Remez, Or Litany, Alex Bronstein

Abstract:

The pursuit of smaller pixel sizes at ever-increasing resolution in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction-limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has recently been embodied in the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, which must produce a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data-fitting term with a sparse synthesis prior. We also show an efficient, hardware-friendly, real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
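As a toy illustration of the ML data-fitting term for threshold-1 jots, the sketch below estimates the per-block mean photon count from the fraction of fired jots (for which the closed-form ML estimate is -ln(1 - k/n)); the sparse synthesis prior and the hardware-friendly approximate inverse operator of the actual method are deliberately omitted, and the jot data are simulated.

```python
# Toy sketch of the maximum-likelihood step for one-bit (threshold-1) jots: within
# an n-jot block where k jots fired, the ML estimate of the mean photon count per
# jot is -ln(1 - k/n). The sparse prior used in the paper is omitted here.
import numpy as np

def ml_block_estimate(binary_frame, block=8):
    """Downsample a dense binary jot frame to an ML intensity estimate per block."""
    h, w = binary_frame.shape
    h, w = h - h % block, w - w % block
    blocks = binary_frame[:h, :w].reshape(h // block, block, w // block, block)
    frac_fired = blocks.mean(axis=(1, 3))
    frac_fired = np.clip(frac_fired, 0, 1 - 1e-6)     # avoid log(0) at saturation
    return -np.log1p(-frac_fired)                      # lambda_hat per block

# Emulate a jot frame: Poisson photons with a smooth intensity ramp, threshold at 1
rng = np.random.default_rng(0)
lam = np.linspace(0.05, 3.0, 512)[None, :] * np.ones((512, 1))
jots = (rng.poisson(lam) >= 1).astype(float)
print(ml_block_estimate(jots).round(2)[0, ::8])        # recovered intensity ramp
```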

Keywords: binary pixels, maximum likelihood, neural networks, sparse coding

Procedia PDF Downloads 182
13746 Seismic Performance of Concrete Moment Resisting Frames in Western Canada

Authors: Ali Naghshineh, Ashutosh Bagchi

Abstract:

Performance-based seismic design concepts are increasingly being adopted in various jurisdictions. While the National Building Code of Canada (NBCC) is not fully performance-based, it provides some features of a performance-based code, such as displacement control and objective-based solutions. Performance evaluation is an important part of performance-based design. In this paper, the seismic performance of a set of code-designed 4-, 8-, and 12-story moment-resisting concrete frames located in Victoria, BC, in western Canada, has been studied at different hazard levels, namely SLE (Service Level Event), DLE (Design Level Event), and MCE (Maximum Considered Event). The seismic performance of these buildings has been evaluated based on FEMA 356 and ATC 72 procedures and on nonlinear time history analysis. Pushover analysis has been used to investigate the different performance levels of these buildings and to adjust their design based on the corresponding target displacements. Since pushover analysis ignores higher-mode effects, nonlinear dynamic time history analysis using a set of ground motion records has been performed. Different types of ground motion records, such as crustal and subduction earthquake records, have been used for the dynamic analysis to determine their effects. Results obtained from pushover analysis for inter-story drift, displacement, shear, and overturning moment are compared to those from the dynamic analysis.

Keywords: seismic performance, performance-based design, concrete moment resisting frame, crustal earthquakes, subduction earthquakes

Procedia PDF Downloads 249
13745 Language Shapes Thought: An Experimental Study on English and Mandarin Native Speakers' Sequencing of Size

Authors: Hsi Wei

Abstract:

Does the language we speak affect the way we think? This question has been discussed for a long time from different perspectives. In this article, the issue is examined with an experiment on how speakers of different languages tend to sequence the sizes of common objects. An essential difference between English and Mandarin usage is the way the sizes of places or objects are sequenced. In English, when describing the location of something, we may say, for example, 'The pen is inside the trashcan next to the tree at the park.' In Mandarin, however, we would say, 'The pen is at the park next to the tree inside the trashcan.' It is clear that English generally uses a small-to-big sequence, while Mandarin uses the opposite. Therefore, an experiment was conducted to test whether this difference between the languages affects speakers' ability to perform the two kinds of sequencing. There were two groups of subjects: one consisted of English native speakers, the other of Mandarin native speakers. In the experiment, three nouns were shown as a group to the subjects in their native languages. Before they saw the nouns, they first received an instruction of 'big to small', 'small to big', or 'repeat'. The subjects then had to sequence the following group of nouns according to the instruction or simply repeat the nouns. After completing each sequencing or repetition in their minds, they pushed a button as a response. The repetition condition was designed to capture each person's mere reading time. The results showed that English native speakers reacted more quickly to 'small to big' sequencing, whereas Mandarin native speakers reacted more quickly to 'big to small' sequencing. To conclude, this study may be of importance as support for linguistic relativism: the language we speak does shape the way we think.

Keywords: language, linguistic relativism, size, sequencing

Procedia PDF Downloads 264
13744 The Strategic Entering Time of a Commerce Platform

Authors: Chia-li Wang

Abstract:

The surge of service and commerce platforms, such as e-commerce and the internet of things, has rapidly changed our lives. How to avoid congestion and get the job done on a platform is now a common problem that many people encounter every day. This requires platform users to make decisions about when to enter the platform. To that end, we investigate the strategic entering time for a simple platform containing random numbers of buyers and sellers of some item. Upon a trade, the buyer and the seller gain their respective profits, yet they pay the cost of waiting in the platform. To maximize their expected payoffs from trading, both buyers and sellers can choose their entering times. This creates an interesting and practical framework of a game played among buyers, among sellers, and between the two sides. That is, a strategy employed by a player is not only directed against players of its own type but is also a response to those of the other type; thus, a strategy profile is composed of the strategies of buyers and sellers. The players' best response, the Nash equilibrium (NE) strategy profile, is derived from a pair of differential equations, which, in turn, are used to establish its existence and uniqueness. More importantly, its structure sheds valuable light on how the entering strategy of one side (buyers or sellers) is affected by the entering behavior of the other side. These results provide a basis for the study of dynamic pricing under stochastic demand-supply imbalances. Finally, comparisons between the social welfare (the sum of the payoffs of individual participants) obtained by the optimal strategy and by the NE strategy are conducted to show the efficiency loss relative to the socially optimal solution, which should help manage the platform better.

Keywords: double-sided queue, non-cooperative game, nash equilibrium, price of anarchy

Procedia PDF Downloads 71
13743 Ranking the Factors That Influence the Construction Project Success: The Jordanian Perspective

Authors: Ghanim A. Bekr

Abstract:

Project success is what must be achieved for the project to be acceptable to the client, stakeholders, and end-users who will be affected by the project. The study of project success and critical success factors (CSFs) is a means adopted to improve the effectiveness of projects. This research attempts to identify which variables influence the success of project implementation. Through an extensive literature review and interviews, this study selected 83 factors, categorized into 7 groups, which questionnaire respondents were asked to score. Responses from 66 professionals with an average of 15 years of experience in different types of construction projects in Jordan were collected and analyzed using SPSS, and the most important factors for the various success criteria are presented, using the relative importance index to rank the categories. The research revealed that the significant groups of factors are: client-related factors, contractor-related factors, project manager (PM)-related factors, and project management-related factors. In addition, the top ten sub-factors are: the client's insistence on a short project duration, availability of skilled labor, the client's insistence on a high level of quality, the client's capability of taking risk, the PM's previous experience in similar projects, the contractor's previous experience in similar projects, timely decision-making by the client or the client's representative, the client's insistence on a low project cost, project management experience gained in previous projects, and the flow of information among parties. The results should help construction project professionals take proactive measures for the successful completion of construction projects in Jordan.
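The abstract does not spell out the index formula; the sketch below uses the relative importance index in its commonly used form, RII = ΣW / (A × N), with an assumed 1-5 scoring scale and hypothetical survey responses, purely to illustrate how the ranking is produced.

```python
# Sketch of the relative importance index (RII) in its commonly used form,
# RII = sum(W) / (A * N), where W are the respondents' scores, A is the highest
# score on the scale (assumed 1-5 here), and N is the number of respondents.
# The survey responses below are hypothetical.
def rii(scores, highest=5):
    return sum(scores) / (highest * len(scores))

responses = {
    "client insists on short project duration": [5, 4, 5, 5, 4, 5, 3, 5],
    "availability of skilled labor":            [5, 5, 4, 4, 5, 4, 5, 4],
    "flow of information among parties":        [3, 4, 4, 3, 5, 4, 3, 4],
}
for factor, scores in sorted(responses.items(), key=lambda kv: -rii(kv[1])):
    print(f"{rii(scores):.3f}  {factor}")   # factors ranked by RII
```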

Keywords: construction projects, critical success factors, Jordan, project success

Procedia PDF Downloads 140
13742 A Rotating Facility with High Temporal and Spatial Resolution Particle Image Velocimetry System to Investigate the Turbulent Boundary Layer Flow

Authors: Ruquan You, Haiwang Li, Zhi Tao

Abstract:

A time-resolved particle image velocimetry (PIV) system is developed to investigate the boundary layer flow under the effects of the rotating Coriolis and buoyancy forces. This time-resolved PIV system consists of a 10 W continuous laser diode and a high-speed camera. The laser diode provides a light sheet less than 1 mm thick, and the high-speed camera can capture 6400 frames per second at 1024×1024 pixels. The laser and the camera are both fixed on the rotating facility, which has a radius of 1 m and rotates at up to 500 revolutions per minute, so the boundary layer flow velocity in the rotating channel, with and without ribs, can be measured directly under rotating conditions. To investigate the effect of the buoyancy force, transparent heater glasses are used to provide a constant heat flux; density differences are thereby generated near the channel wall, and the buoyancy force can be simulated when the channel is rotating. Owing to the high temporal and spatial resolution of the system, proper orthogonal decomposition (POD) can be applied to analyze the characteristics of the turbulent boundary layer flow under rotating conditions. With this rotating facility and PIV system, the velocity profile, Reynolds shear stress, spatial and temporal correlations, and the POD modes of the turbulent boundary layer flow can be examined.
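As a minimal sketch of the POD step, the snippet below performs snapshot POD on one velocity component via the singular value decomposition; the snapshot count and field dimensions are placeholders rather than the actual PIV resolution.

```python
# Sketch of snapshot POD on the PIV velocity fields via the SVD, the standard way
# to extract the energetic modes mentioned above. Field dimensions are assumed.
import numpy as np

def snapshot_pod(velocity_fields):
    """velocity_fields: (n_snapshots, ny, nx) array of one velocity component."""
    n, ny, nx = velocity_fields.shape
    X = velocity_fields.reshape(n, -1).T           # columns are snapshots
    X = X - X.mean(axis=1, keepdims=True)          # subtract the mean field
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)               # relative modal energy
    modes = U.T.reshape(-1, ny, nx)                # spatial POD modes
    return modes, energy

fields = np.random.default_rng(2).normal(size=(300, 64, 128))  # placeholder PIV data
modes, energy = snapshot_pod(fields)
print(energy[:5])   # fraction of fluctuation energy captured by the leading modes
```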

Keywords: rotating facility, PIV, boundary layer flow, spatial and temporal resolution

Procedia PDF Downloads 161
13741 Velocity Profiles of Vowel Perception by Javanese and Sundanese English Language Learners

Authors: Arum Perwitasari

Abstract:

Learning L2 sounds is influenced by the first language (L1) sound system. The current study examines how listeners with a different L1 vowel system perceive L2 sounds. The fact that English has a larger vowel inventory than Javanese and Sundanese might cause problems for Javanese and Sundanese English language learners when perceiving English sounds. To reveal L2 sound perception over time, we measured the mouse trajectories produced by the hand movements of Javanese and Sundanese language learners, speakers of two local Indonesian languages. Do Javanese and Sundanese listeners show higher velocity than English listeners when they perceive English vowels that are similar or new with respect to their L1 system? The study aims to map the patterns of real-time processing through the corresponding hand movements, to reveal any uncertainty in making selections. The results showed that the Javanese listeners exhibited significantly slower velocity values than the English listeners for the similar vowels /I, ɛ, ʊ/ in the 826-1200 ms window post stimulus. Unlike the Javanese, the Sundanese listeners showed slow velocity values for all similar vowels except /ʊ/. For the perception of the new vowels /i:, æ, ɜ:, ʌ, ɑː, u:, ɔ:/, the Javanese listeners showed slower velocity in making the lexical decision. In contrast, the Sundanese listeners showed slow velocity only for the vowels /ɜ:, ɔ:, æ, I/, indicating that these vowels are hard to perceive. Our results fit well with second language models of how the L1 vowel system influences L2 sound perception.
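A minimal sketch of how such velocity values can be computed from sampled cursor positions is given below; the sampling rate and the placeholder trajectories are assumptions, while the 826-1200 ms analysis window follows the abstract.

```python
# Sketch of deriving a velocity profile from sampled mouse coordinates in a
# choice task. The sampling rate and the trajectory data are placeholders.
import numpy as np

def velocity_profile(x, y, sample_hz=100):
    """Instantaneous cursor speed (pixels/s) from equally sampled x, y positions."""
    dt = 1.0 / sample_hz
    return np.hypot(np.diff(x), np.diff(y)) / dt

def mean_window_velocity(x, y, t_start_ms=826, t_end_ms=1200, sample_hz=100):
    v = velocity_profile(x, y, sample_hz)
    i0 = int(t_start_ms / 1000 * sample_hz)
    i1 = int(t_end_ms / 1000 * sample_hz)
    return v[i0:i1].mean()

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(2.0, 1.0, 200))    # placeholder 2 s trajectory at 100 Hz
y = np.cumsum(rng.normal(1.5, 1.0, 200))
print(round(mean_window_velocity(x, y), 1))
```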

Keywords: velocity profiles, EFL learners, speech perception, experimental linguistics

Procedia PDF Downloads 201
13740 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end user development methods, programming could become available to everyone. These methods enable end users to program their own devices and extend the functionality of existing systems without any knowledge of programming languages. In this paper, we describe the Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group, and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they directly selected the cells using the mouse) than in ISPM. The broader goal is to use natural language for end user software engineering and thereby overcome the present bottleneck of relying on professional developers.
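ISPM infers the table schema with a machine learning technique; purely to illustrate what "inferring headers and data ranges" means, the sketch below uses a simple rule-based heuristic (a mostly non-numeric first row above mostly numeric rows is treated as the header) as a stand-in, not the paper's actual model.

```python
# Sketch of table-schema inference. ISPM uses a machine learning technique; this
# simple heuristic is only a stand-in to illustrate the idea.
def looks_numeric(cell):
    try:
        float(str(cell).replace(",", ""))
        return True
    except ValueError:
        return False

def infer_schema(rows):
    """rows: list of equally long lists of cell values from a spreadsheet area."""
    def numeric_share(row):
        filled = [c for c in row if str(c).strip()]
        return sum(looks_numeric(c) for c in filled) / max(len(filled), 1)
    has_header = numeric_share(rows[0]) < 0.5   # mostly text -> treat as header
    header = rows[0] if has_header else None
    first_data = 1 if has_header else 0
    return {"header": header, "data_range": (first_data, len(rows) - 1)}

sheet = [["Month", "Revenue", "Cost"],
         ["Jan", 1200, 800],
         ["Feb", 1350, 790]]
print(infer_schema(sheet))
# A query like "sum revenue" can then be resolved to the 'Revenue' data column.
```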

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 287
13739 Distinct Patterns of Resilience Identified Using Smartphone Mobile Experience Sampling Method (M-ESM) and a Dual Model of Mental Health

Authors: Hussain-Abdulah Arjmand, Nikki S. Rickard

Abstract:

The response to stress can be highly heterogeneous and may be influenced by methodological factors. The integrity of the data is optimized by measuring both positive and negative affective responses to an event, by measuring responses in real time as close to the stressful event as possible, and by utilizing data collection methods that do not interfere with naturalistic behaviours. The aim of the current study was to explore short-term prototypical responses to major stressor events on outcome measures encompassing both positive and negative indicators of psychological functioning. A novel mobile experience sampling methodology (m-ESM) was utilized to monitor affective responses to stressors in real time. A smartphone mental health app ('Moodprism'), which prompts users daily to report both their positive and negative mood as well as whether any significant event has occurred in the past 24 hours, was developed for this purpose. A sample of 142 participants was recruited as part of the promotion of this app. Participants' daily reported experience of stressor events, levels of depressive symptoms, and positive affect were collected across a 30-day period as they used the app. For each participant, major stressor events were identified based on the subjective severity of the event as rated by the user. Depression and positive affect ratings were extracted for the three days following the event. Responses to the event were scaled relative to the participant's general reactivity across the remainder of the 30-day period. Participants were first clustered into groups based on initial reactivity and subsequent recovery following a stressor event. This revealed distinct patterns of responding in depressive symptomatology and positive affect. Participants were then grouped based on their allocations to clusters in each outcome variable. A highly individualized pattern of responding to stressor events, in both symptoms of depression and levels of positive affect, was observed. A complete description of the novel profiles identified will be presented at the conference. These findings suggest that real-time measurement of both positive and negative functioning in response to stressors yields a more complex set of responses than previously observed with retrospective reporting. The use of smartphone technology to measure individualized responding also proved to provide significant insight.
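The sketch below illustrates the profiling step described above: each participant's post-event depression ratings are scaled against their own non-event baseline, and the resulting short-term profiles are clustered; the file name, column names, and cluster count are assumptions rather than the study's actual analysis settings.

```python
# Sketch of reactivity/recovery profiling: post-event depression ratings are
# z-scored against each participant's own non-event days, then clustered.
# File name, column names, and the cluster count are assumptions; the CSV is
# assumed to be ordered by participant and day.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

daily = pd.read_csv("moodprism_daily.csv")    # hypothetical: id, day, depression, event

profiles = []
for pid, d in daily.groupby("id"):
    base = d.loc[d["event"] == 0, "depression"]
    event_days = d.index[d["event"] == 1]
    if len(event_days) == 0 or base.std() == 0:
        continue
    day0 = event_days[0]                            # first major stressor event
    post = d.loc[day0:, "depression"].head(4)       # event day + following 3 days
    profiles.append(((post - base.mean()) / base.std()).to_numpy())

X = np.vstack([p for p in profiles if len(p) == 4])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                    # participants per response profile
```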

Keywords: depression, experience sampling methodology, positive functioning, resilience

Procedia PDF Downloads 221
13738 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence is invaluable in identifying crime. It has been observed that algorithms based on artificial intelligence (AI) are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict a person. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework within Eclipse, with a Postgres repository and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISA and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510 files while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
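
The kind of work performed by a hash-based integrity agent can be sketched as follows: stream evidence files through SHA-256 and compare the digests against a known-hash set. This is a generic Python sketch, not the MADIK agents’ code; the directory and hash set are placeholders.

```python
# Generic sketch of a hash-set integrity check; not the MADIK implementation.
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large evidence files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_against_hash_set(evidence_dir, known_hashes):
    """Return files under evidence_dir whose digests appear in the known-hash set."""
    hits = []
    for path in Path(evidence_dir).rglob("*"):
        if path.is_file() and sha256_of(path) in known_hashes:
            hits.append(path)
    return hits

# Placeholder usage: known_hashes would be loaded from a reference hash database.
# hits = check_against_hash_set("/cases/lone_wolf", {"<known sha256 digest>"})
```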

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 189
13737 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems, this approach can be incomplete and hence imprecise, and moreover too slow to be computed efficiently. Therefore, such models might not be applicable to the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models overcoming the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general since it generates models for any system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications and under strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm such as WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
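
A minimal sketch of regression with product terms, in the spirit of a truncated series expansion, is shown below. It is not the authors’ method or WORHP; it only illustrates how interaction terms between input variables can be included in a data-based model. The hidden law and coefficient values are made up for the example.

```python
# Generic sketch of data-based modelling with product (interaction) terms.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))                          # two measured inputs
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 2.0 * X[:, 0] * X[:, 1]    # hidden law with a product term

# Expand inputs into 1, x1, x2, x1^2, x1*x2, x2^2 and fit linear coefficients
expansion = PolynomialFeatures(degree=2, include_bias=True)
model = LinearRegression(fit_intercept=False).fit(expansion.fit_transform(X), y)
print(dict(zip(expansion.get_feature_names_out(["x1", "x2"]), model.coef_.round(3))))
```

Because the fitted model is a fixed set of basis terms with identified coefficients, it evaluates in microseconds, which is what makes it usable inside an optimization loop.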

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 384
13736 A Case Study on Post-Occupancy Evaluation of User Satisfaction in Higher Educational Buildings

Authors: Yuanhong Zhao, Qingping Yang, Andrew Fox, Tao Zhang

Abstract:

Post-occupancy evaluation (POE) is a systematic approach to assessing actual building performance after a building has been occupied for some time. In this paper, a structured POE assessment was conducted using the building use survey (BUS) methodology in two higher educational buildings in the United Kingdom. This study aims to help close the building performance gap, provide optimized building operation suggestions, and improve occupants’ satisfaction levels. In this research, the questionnaire survey investigated the influences of environmental factors on user satisfaction, covering the main aspects of overall building design, thermal comfort, perceived control, and indoor environment quality for noise, lighting, and ventilation, as well as non-environmental factors such as background information on age, sex, time in the buildings, and workgroup size. The results indicate that occupant satisfaction with the main aspects of overall building design, indoor environment quality, and thermal comfort in summer and winter was lower than the benchmark data in both buildings. The feedback from this POE assessment has been reported to the building management team to allow managers to develop high-performance building operation plans. Finally, this research provides improvement suggestions for the building operation system to narrow the performance gap and improve users’ satisfaction with their work experience and their productivity levels.
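
As a small illustration of how survey results can be compared against benchmark data, the sketch below computes the gap between mean ratings and assumed benchmark values. All variable names, ratings, and benchmark figures are placeholders, not the study’s data or the BUS benchmark itself.

```python
# Placeholder example of a benchmark comparison for survey ratings on a 7-point scale.
import pandas as pd

responses = pd.DataFrame({
    "overall_design": [5, 4, 6, 3, 5],
    "thermal_summer": [3, 2, 4, 3, 2],
    "noise":          [4, 5, 3, 4, 4],
})
benchmark = pd.Series({"overall_design": 4.8, "thermal_summer": 4.2, "noise": 4.5})

gap = responses.mean() - benchmark   # negative values indicate below-benchmark satisfaction
print(gap.round(2))
```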

Keywords: building performance assessment systems, higher educational buildings, post-occupancy evaluation, user satisfaction

Procedia PDF Downloads 131