Better image coloring logic/algorithm

I am developing an iOS app in which the user can change the color of part of an image, say a Tea Cup, by touching it. I am using the flood-fill algorithm so that the user only has to tap on the Tea Cup to change its color. That is working fine. But the final color looks a little different from the replacement color. I am having trouble finding better logic to convert the object's (Tea Cup's) color to the selected color while respecting its saturation and lightness.

I am using the following logic to get the result color, representing each color as (hue, saturation, value):

    touchedColor     = (tchd_h, tchd_s, tchd_v); // I am not using this now
    pixelColor       = (old_h, old_s, old_v);
    replacementColor = (new_h, new_s, new_v);
    resultColor      = (new_h, new_s, old_v);
    pixelColor       = resultColor;

The cup before painting (circled with red color):

[image]

The selected replacementColor:

[image]

The cup after painting with the replacementColor (circled with red color):

[image]

See the final image above. Since I change only the hue and saturation, and not the value, of the pixelColor, the applied color doesn't look like the selected replacementColor; the lightness of the image remains unaltered.

If I change the value along with the hue and saturation, like this:

    resultColor = (new_h, new_s, new_v);
    pixelColor  = resultColor;

then the cup becomes flat-colored, missing the lights and shades:

[image]

I am looking for a way to tweak the above logic so that the pixel color changes to a matching replacement color. Maybe there is a formula to derive the new saturation and value.

Answer

In your example, let's call the pink color the "color to replace" and the brown color the "replacement color." For each pixel in the destination, find the corresponding pixel in the source and see how it varies from the color to replace. Then make the same adjustments to the replacement color and use the result as the color of the current output pixel.

    As an example, if the current source pixel is darker than the color to replace by 5 “v” units, then set the output pixel to the replacement color made darker by 5 “v” units. (And you’d want to make the same adjustments in hue and saturation, as well.)

    You’ll probably need to limit the range of colors you adjust so you aren’t turning other objects a different color.
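One way to limit the range is to recolor only pixels whose HSV distance from the color to replace falls within some tolerances. A sketch, where the tolerance values are hypothetical numbers you would tune per image:

```c
#include <math.h>

typedef struct { float h, s, v; } HSV;  /* h in [0,360), s and v in [0,1] */

/* Returns 1 if `c` is close enough to `target` to be recolored.
   Hue distance wraps around the 360-degree circle. */
static int within_tolerance(HSV c, HSV target,
                            float hTol, float sTol, float vTol) {
    float dh = fabsf(c.h - target.h);
    if (dh > 180.0f) dh = 360.0f - dh;
    return dh <= hTol
        && fabsf(c.s - target.s) <= sTol
        && fabsf(c.v - target.v) <= vTol;
}
```

Pixels that fail the check keep their original color, so nearby objects of other colors are left alone.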