swift, for-loop, floating-point, uiimageview

Custom Review star images not showing properly


I have added 5 image views in a storyboard and connected them to an outlet collection like this:

@IBOutlet var starRatingImageViews: [UIImageView]!

I am getting the rating as a string value from JSON, like this:

rating = "4.2"

For example, for rating = "4.2" I need to show 4 full stars and one half star, but it isn't displayed properly.

Output: here the rating is "4.2", but the wrong stars are shown.

Code: here user.rating = "4.2", but the wrong stars are shown in the output. Where am I going wrong in the fillStars method?

 @IBOutlet var starRatingImageViews: [UIImageView]!

 if let rating = Float(user.rating ?? "0.0") {
     rating.fillStars(in: starRatingImageViews)
 }

 func fillStars(in imageViews: [UIImageView], withEmptyStar color: UIColor = .darkGray) {
    for i in (0 ..< imageViews.count) {
        imageViews[i].contentMode = .scaleAspectFit
        let review = Int(Float(self))
        if (i < review) {
            imageViews[i].image = UIImage(systemName: "star.fill")?.withRenderingMode(.alwaysOriginal).withTintColor(CommonColor.yellowColor)
        } else {
            if (self - Float(review)) > (0.5), i == review {
                imageViews[i].image = UIImage(systemName: "star.fill")?.withRenderingMode(.alwaysOriginal).withTintColor(CommonColor.yellowColor)
            } else if (self - Float(review)) <= (0.5), (self - Float(review)) > (0.0), i == review {
                imageViews[i].image = UIImage(systemName: "star.leadinghalf.fill")?.withRenderingMode(.alwaysOriginal).withTintColor(CommonColor.yellowColor)
            } else {
                imageViews[i].image = UIImage(systemName: "star")?.withRenderingMode(.alwaysOriginal).withTintColor(color)
            }
        }
    }
}

Solution

  • The problem is that you're using self - Float(review) to figure out what kind of star to draw, but self is independent of your loop, and you're not even consistent about it, since you use i < review as the first test. You can make that work, but it's harder than it needs to be. The matter is confused further by having more cases than you need: you only ever need to draw a filled star, a half-filled star, or an empty star. That's 3 cases, but your code has 4, because the full-star case is duplicated.

    In comments you clarified that let review = Int(Float(self)) is converting from a String. So let's look at what happens when self is supposed to be "4.2" and i is 3, which is where the partial star is incorrectly computed. In that case review will be 4, so i < review should be true (3 is less than 4), and imageViews[i] should be set to a full star, but it isn't. This tells me that self is not "4.2", and therefore review is not 4. Either print self or put a breakpoint on the if to see what self and review actually are; that will probably show you your bug. Based on that, I think your assumptions about what self is are likely incorrect somehow.
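
    For example, here is a minimal debugging sketch (keeping fillStars as an extension method for now): compute review once at the top of the method and print both values before the loop, then run the failing case.

    func fillStars(in imageViews: [UIImageView], withEmptyStar color: UIColor = .darkGray) {
        let review = Int(Float(self))
        // Temporary: for a "4.2" rating you expect 4.2 and 4 here.
        print("fillStars: self = \(self), review = \(review)")
        // ... the existing loop, unchanged ...
    }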

    Even after fixing that, though, fillStars still has a bug: when i is 4, that is, on the last star, it fills the star completely when it should be empty.

    Nonetheless, fillStars could be simpler.

    When I have problems like this, I prefer to think in terms of calculating the percent of the current indicator (star, in your case) that needs to be drawn, and then use that result to decide what to do, rather than mixing the calculation and decision-making together. So I would modify your code to be something like this:

    func clamp<T: Comparable>(_ value: T, to range: ClosedRange<T>) -> T {
        return max(range.lowerBound, min(range.upperBound, value))
    }
    
    func fillStars(
        in imageViews: [UIImageView],
        withEmptyStar color: UIColor = .darkGray)
    {
        let review = Float(self) // Is this needed now?
    
        for i in imageViews.indices
        {
            imageViews[i].contentMode = .scaleAspectFit
            let tintColor: UIColor
            let starName: String
            
            // Calculate the percentage of the current star to draw
            let percentOfThisStar = clamp(review - Float(i), to: 0...1)
    
            // Now use the percentage to pick the image/tinting
            if percentOfThisStar == 0
            { // Empty star
                tintColor = color
                starName  = "star"
            }
            else
            {
                tintColor = CommonColor.yellowColor
                starName  = percentOfThisStar < 0.5
                    ? "star.leadinghalf.fill" // Partial star
                    : "star.fill"             // Full star
            }
            
            imageViews[i].image = UIImage(systemName: starName)?
                .withRenderingMode(.alwaysOriginal)
                .withTintColor(tintColor)
    
        }
    }
    

    As a suggestion, I'd probably move fillStars to another part of your code base. I'm not exactly sure what self is (Float? Double? Decimal?), except that it's some numeric type, and it seems fillStars is a method in an extension on whatever that numeric type is. I'd probably just make it a free function instead, or maybe a method in whatever view contains the stars, and pass the review in as a parameter (see the sketch below). My reasoning is just that filling stars doesn't have much to do with being any of the numeric types that self might be. It's a special-case usage for your app, probably for one specific view, so why not put it in that view, or in a top-level app-specific function? Obviously that's a judgement call: the compiler doesn't care, so it's just a matter of what makes the most sense for you and your team.
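
    For example, a minimal sketch of the free-function version (the body is the same as the code above; rating and the call site below just reuse the question's names):

    func fillStars(
        rating: Float,
        in imageViews: [UIImageView],
        withEmptyStar color: UIColor = .darkGray)
    {
        // Same body as the version above, using the `rating`
        // parameter where `review` was (no conversion from self).
    }

    // Call site, using the outlet collection from the question:
    if let rating = Float(user.rating ?? "0.0") {
        fillStars(rating: rating, in: starRatingImageViews)
    }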

    To make it a little easier to test without creating a full-on app (and to avoid having to use the Simulator), I wrote it like this in a command line tool:

    enum StarType: String
    {
        case empty   = "[  ]"
        case partial = "[* ]"
        case filled  = "[**]"
    }
    
    func clamp<T: Comparable>(_ value: T, to range: ClosedRange<T>) -> T {
        return max(range.lowerBound, min(range.upperBound, value))
    }
    
    
    func fillStars(
        in stars: inout [StarType],
        from rating: Float)
    {
        assert(Float(stars.count) >= rating)
        
        for i in stars.indices
        {
            // Calculate the percentage of the current star to draw
            let percentOfThisStar = clamp(rating - Float(i), to: 0...1)
    
            // Now use the percentage to set the star type
            let starType: StarType
            if percentOfThisStar == 0
            { // Empty star
                starType = .empty
            }
            else
            {
                starType  = percentOfThisStar < 0.5
                    ? .partial
                    : .filled
            }
            
            // This could have been done in the `if`, but doing it here allows
            // easier refactoring to UIImageView, if that's desired.
            stars[i] = starType
        }
    }
    

    I renamed review to rating and pass it in as a parameter. Instead of a filled star I print [**], for a partial star I print [* ], and for an empty star I print [  ].

    I drive it with this code:

    var stars: [StarType] = .init(repeating: .empty, count: 5)
    let rating: Float = 4.2
    fillStars(in: &stars, from: rating)
    let starStr = stars.map { $0.rawValue }.joined()
    
    print("rating = \(rating), stars = \(starStr)")
    

    These are the results:

    rating = 4.2, stars = [**][**][**][**][* ]
    

    Which appears to be the correct result.
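
    If you want to sanity-check a few more values without a UI, the same harness can be driven in a loop, something like this (a sketch using the types above; pick whatever ratings you like):

    let testRatings: [Float] = [0, 2.5, 3.7, 4.2, 5]
    for rating in testRatings {
        var stars: [StarType] = .init(repeating: .empty, count: 5)
        fillStars(in: &stars, from: rating)
        print("rating = \(rating), stars = \(stars.map { $0.rawValue }.joined())")
    }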

    As mentioned above, based on what you've described, I think there is at least one bug that happens before calling fillStars.