RESEARCH PAPERS

Prediction of Cavitation in Journal Bearings Using a Boundary Element Method

Qiulin Yu, Theo G. Keith

Department of Mechanical Engineering, University of Toledo, Toledo, OH 43606

J. Tribol. 117(3), 411–421 (Jul 01, 1995) (11 pages); doi:10.1115/1.2831269. History: Received May 13, 1993; Revised June 11, 1994; Online January 24, 2008

Abstract

In this paper, a boundary element cavitation algorithm is utilized to predict cavitation in journal bearings with axially variable clearance. Film rupture and reformation boundary conditions, obtained from the JFO theory, are directly combined with the generalized boundary integral equation, which is derived from Elrod's universal differential equation. The two boundaries are simulated by two confluent interpolation polynomials, and the governing equation is thereby transformed into an undetermined-boundary problem. The procedure effectively eliminates the solution oscillation experienced by finite difference cavitation algorithms, which is caused by poorly adapted grid shape and density, and it removes the discontinuity in the derivative of the fractional film content. Results for aligned and misaligned journal bearings are compared with those obtained using the finite difference method. Tapered, barrel, and hourglass journal bearings are also analyzed. The computational results demonstrate the effects of the journal geometric parameters on journal bearing performance.

Copyright © 1995 by The American Society of Mechanical Engineers

